## Description

The collection makes use of the REST API exposed by AWS IAM to fetch data about IAM entities and their access to AWS resources. The collection processes the fetched data using "Pre-request" and "Tests" scripts to detect deviations from the AWS IAM audit checklist. The audit result is pushed to Slack so the DevOps team can take further action.

---

## Environment Variables

| Environment Key | Value |
| - | - |
| aws_access_key_id | Read: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys |
| aws_secret_access_key | Read: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys |
| slack_url | Read: https://api.slack.com/incoming-webhooks |

---

**IAM Audit Checklist**

```
1. Check if root user access keys are disabled.
2. Check if a strong password policy is set for the AWS account:
   Minimum password length (10)
   Require at least one lowercase letter
   Require at least one number
   Require at least one non-alphanumeric character
   Allow users to change their own password
   Enable password expiration (365 days)
   Prevent password reuse (24)
3. Identify IAM users (humans) whose active access keys have not been rotated in the last 45 days.
4. Identify IAM users (bots) whose active access keys have not been rotated in the last 180 days.
5. Identify IAM users (humans) who have been inactive for more than 180 days.
6. Identify IAM users (bots) who have been inactive for more than 180 days.
7. Check if MFA is enabled for IAM users (humans).
8. Identify all permissive policies attached to roles. Permissive policies: where the value of "Principal" or "Action" is "*" (wildcard character).
9. Identify unused and not recently used (not accessed in the last 180 days) permissions provisioned via policies attached to an IAM entity (user, group, role, or policy), using service last accessed data.
```
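For illustration, here is a minimal Python/boto3 sketch of two of the checks above (password policy and access-key rotation age) with findings posted to a Slack incoming webhook. This is not the collection's actual Postman scripts; the webhook URL is a placeholder and the 45-day threshold is taken from checklist item 3.

```python
# Minimal sketch (not the collection's Postman scripts): audits the account
# password policy and access-key age with boto3, then posts findings to Slack.
import datetime

import boto3
import requests

SLACK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder slack_url value
HUMAN_KEY_MAX_AGE_DAYS = 45  # threshold from checklist item 3

iam = boto3.client("iam")
findings = []

# Checklist item 2: strong password policy.
try:
    policy = iam.get_account_password_policy()["PasswordPolicy"]
    if policy.get("MinimumPasswordLength", 0) < 10 or not policy.get("RequireNumbers"):
        findings.append("Password policy is weaker than the audit baseline.")
except iam.exceptions.NoSuchEntityException:
    findings.append("No account password policy is set.")

# Checklist item 3: active access keys older than 45 days
# (pagination omitted; all users treated as humans for brevity).
now = datetime.datetime.now(datetime.timezone.utc)
for user in iam.list_users()["Users"]:
    for key in iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]:
        age = (now - key["CreateDate"]).days
        if key["Status"] == "Active" and age > HUMAN_KEY_MAX_AGE_DAYS:
            findings.append(f"{user['UserName']}: key {key['AccessKeyId']} is {age} days old.")

if findings:
    requests.post(SLACK_URL, json={"text": "\n".join(findings)})
```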
A simple collection to monitor Elastic Beanstalk environments. Auditing environments involves the following steps:

1. Fetch all environments using the provided access key id and secret token.
2. If there are additional environments to be fetched beyond the response of the first fetch request, extract the pagination token from the response and use it to repeat the fetch request.
3. Once there are no more environments to be fetched, save the list of environments as an environment variable.
4. Iterate over the list, repeating the configuration description request for each environment in the list. Once the config for an environment is known, compare it to a set of expected results to ensure compliance.

PS: This collection is most useful when run as a [monitor](https://www.getpostman.com/docs/postman/monitors/monitoring_apis_websites), so that such audits run on a periodic basis. You can also configure the built-in Slack integration for Postman monitors to receive instant alerts when things are amiss.

# Required environment variables:

This collection requires the following environment variables:

| Name | Description | Required |
|:----------:|:----------------------------------------------------------------------------:|:--------:|
| id | The access key id for the audit AWS user | Yes |
| key | The secret access key for the audit AWS user | Yes |
| awsRegion | The region to audit environments in. Defaults to us-east-1 | No |
| maxRecords | The number of environments to retrieve per fetch call. Defaults to 100 (max) | No |
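As a rough illustration of steps 1–4, here is a minimal Python/boto3 sketch; the collection itself performs these steps with signed Postman requests, and the compliance expectation at the end is a hypothetical example, not the collection's actual expected results.

```python
# Minimal sketch of the audit flow: paginate through environments, then
# describe each environment's configuration and compare it to expectations.
import boto3

eb = boto3.client("elasticbeanstalk", region_name="us-east-1")  # awsRegion default

# Steps 1-3: fetch all environments, following the pagination token.
environments, token = [], None
while True:
    kwargs = {"MaxRecords": 100}  # maxRecords default
    if token:
        kwargs["NextToken"] = token
    page = eb.describe_environments(**kwargs)
    environments.extend(page["Environments"])
    token = page.get("NextToken")
    if not token:
        break

# Step 4: describe each environment's configuration and check compliance.
for env in environments:
    settings = eb.describe_configuration_settings(
        ApplicationName=env["ApplicationName"],
        EnvironmentName=env["EnvironmentName"],
    )["ConfigurationSettings"]
    for cfg in settings:
        options = {(o["Namespace"], o["OptionName"]): o.get("Value")
                   for o in cfg["OptionSettings"]}
        # Hypothetical expectation: enhanced health reporting must be enabled.
        assert options.get(("aws:elasticbeanstalk:healthreporting:system",
                            "SystemType")) == "enhanced", env["EnvironmentName"]
```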
This Open API specification serves as a standardized framework for describing and defining the capabilities of the AWS Deploy Postman Collection. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This collection identifies which cache clusters in AWS ElastiCache are inactive over a given range of time, based on a few externally specified constraints. It uses CloudWatch to get cluster statistics and is currently based on 3 metrics:

* CacheHits
* CPUUtilization
* CurrItems

Setup
====================

You need to set the following environment variables:

1. AWS Credentials
   a. `accessKeyID`
   b. `secretAccessKey`
2. `slackWebHookURL` *Description: The collection notifies its results on this webhook.*
3. `region` *Description: AWS region to monitor - Ex: "us-east-1"*
4. `days` *Description: Range to monitor clusters, starting from today and going back N days*
5. `period` *Description: Time interval over which AWS collects and aggregates data points. Ex: 36000 (seconds)*

Miscellaneous
====================

1. AWS allows **1440 data points** in a single request. Either increase the period or reduce the number of days so that the number of aggregated data points stays below this limit.
2. The parameters below have **default values** in case they are not set in the environment. You can modify them in the environment if required; however, the defaults are recommended.

<table>
 <tr>
  <th>Variable Name</th>
  <th>Default Value</th>
 </tr>
 <tr>
  <td>days</td>
  <td>14</td>
 </tr>
 <tr>
  <td>period</td>
  <td>36000</td>
 </tr>
 <tr>
  <td>region</td>
  <td>us-east-1</td>
 </tr>
</table>
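As a rough illustration of the idea (not the collection's actual requests or thresholds), the Python/boto3 sketch below pulls the CacheHits metric for each cluster over the configured window and flags clusters with no hits; the zero-hits rule is an assumption for this example.

```python
# Minimal sketch: flag ElastiCache clusters with no cache hits over the window.
import datetime

import boto3

REGION, DAYS, PERIOD = "us-east-1", 14, 36000  # defaults from the table above

cloudwatch = boto3.client("cloudwatch", region_name=REGION)
elasticache = boto3.client("elasticache", region_name=REGION)

end = datetime.datetime.now(datetime.timezone.utc)
start = end - datetime.timedelta(days=DAYS)

# Keep the request under CloudWatch's 1440-data-point limit.
assert (DAYS * 86400) / PERIOD <= 1440

for cluster in elasticache.describe_cache_clusters()["CacheClusters"]:
    cluster_id = cluster["CacheClusterId"]
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/ElastiCache",
        MetricName="CacheHits",
        Dimensions=[{"Name": "CacheClusterId", "Value": cluster_id}],
        StartTime=start,
        EndTime=end,
        Period=PERIOD,
        Statistics=["Sum"],
    )
    total_hits = sum(dp["Sum"] for dp in stats["Datapoints"])
    if total_hits == 0:  # hypothetical inactivity rule; the collection also
        print(f"{cluster_id} looks inactive")  # considers CPUUtilization and CurrItems
```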
This Open API specification serves as a standardized framework for describing and defining the capabilities of the Avon Health APIs. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
AWS Config provides a way to keep track of the configurations of all the AWS resources associated with your AWS account. You can use AWS Config to get the current and historical configurations of each AWS resource and also to get information about the relationships between resources. An AWS resource can be an Amazon Elastic Compute Cloud (Amazon EC2) instance, an Elastic Block Store (EBS) volume, an elastic network interface (ENI), or a security group.
AWS Database Migration Service (AWS DMS) can migrate your data to and from the most widely used commercial and open-source databases such as Oracle, PostgreSQL, Microsoft SQL Server, Amazon Redshift, MariaDB, Amazon Aurora, MySQL, and SAP Adaptive Server Enterprise (ASE). The service supports homogeneous migrations such as Oracle to Oracle, as well as heterogeneous migrations between different database platforms, such as Oracle to MySQL or SQL Server to PostgreSQL.
AWS Data Pipeline configures and manages a data-driven workflow called a pipeline. AWS Data Pipeline handles the details of scheduling and ensuring that data dependencies are met so that your application can focus on processing the data.
AWS CodePipeline is a continuous delivery service that enables you to model, visualize, and automate the steps required to release your software.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the AWS CodeBuild. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the AWS CloudWatch. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
AWS CloudHSM offers secure cryptographic key storage for customers by providing managed hardware security modules in the AWS Cloud.
With AWS CloudTrail, you can monitor your AWS deployments in the cloud by getting a history of AWS API calls for your account, including API calls made via the AWS Management Console, the AWS SDKs, the command line tools, and higher-level AWS services. You can also identify which users and accounts called AWS APIs for services that support CloudTrail, the source IP address the calls were made from, and when the calls occurred. You can integrate CloudTrail into applications using the API, automate trail creation for your organization, check the status of your trails, and control how administrators turn CloudTrail logging on and off.
Amazon EMR is a web service that makes it easy to process large amounts of data efficiently. Amazon EMR uses Hadoop processing combined with several AWS products to do tasks such as web indexing, data mining, log file analysis, machine learning, scientific simulation, and data warehousing.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the AWS Auto Scaling API. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
Amazon CloudFront speeds up distribution of your static and dynamic web content, such as .html, .css, .php, image, and media files. When users request your content, CloudFront delivers it through a worldwide network of edge locations that provide low latency and high performance.
AWS CloudFormation allows you to create and manage AWS infrastructure deployments predictably and repeatedly. You can use AWS CloudFormation to leverage AWS products, such as Amazon Elastic Compute Cloud, Amazon Elastic Block Store, Amazon Simple Notification Service, Elastic Load Balancing, and Auto Scaling to build highly reliable, highly scalable, cost-effective applications without creating or configuring the underlying AWS infrastructure.
Welcome to the AWS Certificate Manager (ACM) service. ACM handles the complexity of creating and managing public SSL/TLS certificates for your AWS-based websites and applications. You can use public certificates provided by ACM (ACM certificates) or certificates that you import into ACM. ACM certificates can secure multiple domain names and multiple names within a domain. You can also use ACM to create wildcard SSL certificates that can protect an unlimited number of subdomains.
AWS Batch enables you to run batch computing workloads on the AWS Cloud. Batch computing is a common way for developers, scientists, and engineers to access large amounts of compute resources, and AWS Batch removes the undifferentiated heavy lifting of configuring and managing the required infrastructure. AWS Batch will be familiar to users of traditional batch computing software. This service can efficiently provision resources in response to jobs submitted in order to eliminate capacity constraints, reduce compute costs, and deliver results quickly.
AWEX API
This Open API specification serves as a standardized framework for describing and defining the capabilities of the AWS API Gateway. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This is the AVEVA LFM Connect API.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the Asgard Storage. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This is a generated connector for [Avaza API v1](https://api.avaza.com/swagger/ui/index) OpenAPI specification. The Avaza API allows you to collaborate on projects, chat, schedule resources, track time, manage expenses & invoice customers.
This is a generated connector for [Avalara AvaTax API v2](https://developer.avalara.com/api-reference/avatax/rest/v2/) OpenAPI specification. Avalara AvaTax is sales tax software that automates tax calculations and the tax filing process.
This is a generated connector for [atSpoke API v0.1.0](https://askspoke.com/api/reference) OpenAPI specification. The atSpoke REST API provides a broad set of operations, including:

- Creation, manipulation, and deletion of requests in atSpoke
- Management of users in atSpoke
- Creation, manipulation, and deletion of knowledge resources in atSpoke

The public API is served from https://api.askspoke.com/api/v1 – note `api` in the host name, not your usual organization id.
This is a generated connector for [Automata API v1.0.1](https://byautomata.io/api/) OpenAPI specification. The Automata API provides the capability to identify market intelligence.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the ArvanCloud. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the ATS APIs. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the Asgard. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This API.
This is a generated connector for [Asana API v1.0](https://developers.asana.com/docs) OpenAPI specification. This API enables you to help teams organize, track and manage their work. For additional help getting started with the API, visit [Asana API](https://developers.asana.com).
Here at Asaas we love saving everyone's time, so we built this collection with all of our APIs. 👍

# Documentation

The full documentation can be accessed [at this link.](https://asaasv3.docs.apiary.io/)

Available environments:

* Local
* Sandbox
* Production

Don't forget to check the collection variables.

## Authentication

To test the examples described here you need an Asaas account. If you don't have one yet, just [create an account by clicking here](https://www.asaas.com/onboarding/createAccount?utm_source=postman) or [here to create one in our sandbox](https://sandbox.asaas.com/onboarding/createAccount).

Authentication is done by providing your API Key. It must be sent on every request in the `access_token` header. If the API Key is invalid or missing, Asaas will return `HTTP 401`.

API Keys differ between the Sandbox and Production environments, so remember to change it when you change the URL. To get your API Key, [go to the Integration tab in the My Account area](https://www.asaas.com/config/index?tab=pushNotification). In the collection, your API Key can be set in the environments area.

> ***Warning***:
> Your API Key carries many privileges, so make sure to keep it protected. Do not share it during support interactions and do not expose it in your application's front end. Also, it cannot be recovered if lost; a new one will have to be generated.
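For illustration, a minimal Python sketch of an authenticated call using the `access_token` header described above; the sandbox base URL and the `/customers` endpoint are assumptions for this example and should be checked against the full documentation.

```python
# Minimal sketch: call an Asaas endpoint with the access_token header.
# The base URL and endpoint below are assumptions for illustration only.
import requests

BASE_URL = "https://sandbox.asaas.com/api/v3"   # assumed sandbox base URL
API_KEY = "your-sandbox-api-key"                # keep this secret

response = requests.get(
    f"{BASE_URL}/customers",                    # hypothetical endpoint
    headers={"access_token": API_KEY},
)

if response.status_code == 401:
    print("Invalid or missing API Key")         # behaviour described above
else:
    response.raise_for_status()
    print(response.json())
```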
## **General Information**

The open API can be used in an online environment through the login credentials issued to the client by Artlab LLC (Артлаб ХХК).

**Requirements:**

- The application must use the REST API
- Data is exchanged in JSON format

**What you need for testing:**

- An API development and testing platform, e.g. Postman
- Test credentials (request them from us)
This Open API specification serves as a standardized framework for describing and defining the capabilities of the ARCAPLANET. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.
This Open API specification serves as a standardized framework for describing and defining the capabilities of the AquiPaga SandboxAPI. It outlines how clients can interact with the API, providing a structured approach to document endpoints, operations, and other integration details. This specification is intended to promote clarity, consistency, and ease of use for developers and consumers of the API, ensuring efficient communication between systems.