8/7/2024, 12:00:00 AM ~ 8/8/2024, 12:00:00 AM (UTC)
Recent Announcements
Amazon CloudWatch Application Signals now supports Amazon Bedrock
Amazon CloudWatch Application Signals now supports Amazon Bedrock, enabling users to troubleshoot errors and slow performance in generative AI applications. Amazon Bedrock is a fully managed service that offers foundation models (FMs) built by leading AI companies such as Anthropic, Meta, and Amazon, along with other tools for building generative AI applications. For users whose generative AI applications rely on Bedrock FMs, this enhancement provides a deeper understanding of how failures, such as model validation exceptions, and latency in different models impact the end-user experience.

Application Signals provides out-of-the-box dashboards that correlate telemetry across metrics, traces, logs, real-user monitoring, and synthetic monitoring for your application and its dependencies, such as Amazon Simple Queue Service (SQS), Amazon S3, or Amazon Bedrock, speeding up troubleshooting of application disruptions. For example, an application developer operating an LLM (Large Language Model) application that invokes Bedrock FMs can track whether their customer support API is experiencing issues. They can then drill into the precisely correlated trace contributing to the error, along with correlated logs, to establish the root cause, such as invalid model inputs or long response times from LLMs that degrade the end-user experience. Tracking model performance within your application lets you evaluate different models and choose the best one for your use case, optimizing for cost and customer experience. To learn more, see the documentation for enabling Amazon CloudWatch Application Signals on applications that interact with Amazon Bedrock models. To try Application Signals on a sample application, visit the AWS One Observability Workshop.
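As a minimal sketch of the troubleshooting scenario described above: an instrumented application calling Bedrock, with error codes bucketed the way you might triage them from an Application Signals trace. The model ID and the error-code buckets are assumptions for illustration; the tracing itself comes from ADOT auto-instrumentation, not from this code.

```python
import json

# Illustrative triage buckets for common Bedrock error codes (assumed mapping).
RETRYABLE = {"ThrottlingException", "ServiceUnavailableException", "ModelTimeoutException"}
CLIENT_FAULTS = {"ValidationException", "AccessDeniedException"}

def classify_bedrock_error(error_code: str) -> str:
    """Bucket a Bedrock error code the way you might triage it from a trace."""
    if error_code in RETRYABLE:
        return "retry"
    if error_code in CLIENT_FAULTS:
        return "fix-request"
    return "investigate"

def invoke_claude(prompt: str, model_id: str = "anthropic.claude-3-5-sonnet-20240620-v1:0"):
    """Call Bedrock; with ADOT auto-instrumentation enabled, this call is traced."""
    import boto3  # assumes boto3 is installed and AWS credentials are configured
    from botocore.exceptions import ClientError
    client = boto3.client("bedrock-runtime")
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }
    try:
        resp = client.invoke_model(modelId=model_id, body=json.dumps(body))
        return json.loads(resp["body"].read())
    except ClientError as err:
        code = err.response["Error"]["Code"]
        print(f"Bedrock call failed ({code}) -> {classify_bedrock_error(code)}")
        raise
```

A `ValidationException` here is exactly the "invalid model inputs" case the announcement mentions; the correlated trace and logs would point you at the offending request.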
Announcing the general availability of AWS Backup logically air-gapped vault
Today, AWS Backup announces the general availability of logically air-gapped vault, a new type of AWS Backup vault that enables secure sharing of backups across accounts and organizations and supports direct restore to help reduce recovery time after a data loss event. A logically air-gapped vault stores immutable backup copies that are locked by default and isolated with encryption using AWS owned keys.

You can get started with logically air-gapped vault using the AWS Backup console, API, or CLI. Target backups to a logically air-gapped vault by specifying it as a copy destination in your backup plan. Share the vault for recovery or restore testing with other accounts using AWS Resource Access Manager (RAM). Once shared, you can initiate direct restore jobs from that account, eliminating the overhead of copying backups first. AWS Backup support for logically air-gapped vault is available in the following Regions: US East (N. Virginia, Ohio), US West (N. California, Oregon), Africa (Cape Town), Asia Pacific (Hong Kong, Hyderabad, Jakarta, Melbourne, Mumbai, Osaka, Seoul, Singapore, Sydney, Tokyo), Canada (Central), Europe (Frankfurt, Ireland, London, Milan, Paris, Spain, Stockholm, Zurich), Middle East (Bahrain, UAE), Israel (Tel Aviv), and South America (Sao Paulo). It currently supports Amazon Elastic Compute Cloud (EC2), Amazon Elastic Block Store (EBS), Amazon Aurora, Amazon DocumentDB, Amazon Neptune, AWS Storage Gateway, Amazon Simple Storage Service (S3), Amazon Elastic File System (EFS), Amazon DynamoDB, Amazon Timestream, AWS CloudFormation, and VMware. For more information, visit the AWS Backup product page, documentation, and launch blog.
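A hedged sketch of the first step, creating the vault itself. The operation and parameter names follow the AWS Backup API as published at launch (`CreateLogicallyAirGappedBackupVault` with min/max retention bounds); verify them against your boto3 version before relying on this.

```python
def validate_retention(min_days: int, max_days: int) -> None:
    """Retention bounds for a logically air-gapped vault must form a sane range."""
    if not (0 < min_days <= max_days):
        raise ValueError("require 0 < MinRetentionDays <= MaxRetentionDays")

def create_air_gapped_vault(name: str, min_days: int, max_days: int):
    """Create the vault; requires a recent boto3 and AWS credentials."""
    import boto3  # assumes boto3 new enough to include the Backup operation
    validate_retention(min_days, max_days)
    client = boto3.client("backup")
    return client.create_logically_air_gapped_backup_vault(
        BackupVaultName=name,
        MinRetentionDays=min_days,
        MaxRetentionDays=max_days,
    )
```

After creation, you would reference the vault ARN as a copy destination in your backup plan and share it via RAM, as described above.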
Amazon RDS for Oracle supports memory optimized R6i instance types in the AWS GovCloud (US) Regions
Starting today, Amazon Relational Database Service (RDS) for Oracle supports memory optimized R6i instance types in the AWS GovCloud (US-East) and AWS GovCloud (US-West) Regions.

The memory optimized R6i instance types feature up to 8x the RAM per vCPU of the existing R6i instance types to better fit your workloads. Many Oracle database workloads require high memory, storage, and I/O bandwidth but can safely reduce the number of vCPUs without impacting application performance. Memory optimized R6i instances come in configurations from 2 vCPUs to 48 vCPUs, with memory from 32 GiB to 1,024 GiB and up to a 64:1 memory-to-vCPU ratio, allowing you to right-size instances for your Oracle workloads. Memory optimized R6i instances are available under the Bring Your Own License (BYOL) model for both Oracle Database Enterprise Edition (EE) and Oracle Database Standard Edition 2 (SE2). You can launch the additional memory configurations of the R6i instance class in the Amazon RDS Management Console or using the AWS CLI. Amazon RDS for Oracle is a fully managed commercial database that makes it easy to set up, operate, and scale Oracle deployments in the cloud. To learn more about Amazon RDS for Oracle, read the RDS for Oracle User Guide, and visit Amazon RDS for Oracle Pricing for available instance configurations, pricing details, and Region availability.
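Rather than hard-coding instance class names, you can ask RDS which R6i classes are orderable in your Region. A sketch under the assumption that you have boto3 and GovCloud-partition credentials; the filtering helper is pure Python:

```python
def r6i_classes(class_names):
    """Keep only R6i-family DB instance classes (pure helper)."""
    return sorted(n for n in class_names if n.startswith("db.r6i."))

def list_oracle_r6i_classes(region: str = "us-gov-west-1"):
    """Query RDS for orderable R6i classes instead of guessing names."""
    import boto3  # assumes boto3 and credentials for the GovCloud partition
    rds = boto3.client("rds", region_name=region)
    names = set()
    paginator = rds.get_paginator("describe_orderable_db_instance_options")
    for page in paginator.paginate(
        Engine="oracle-ee", LicenseModel="bring-your-own-license"
    ):
        for opt in page["OrderableDBInstanceOptions"]:
            names.add(opt["DBInstanceClass"])
    return r6i_classes(names)
```

The memory optimized variants appear alongside the standard R6i classes in that listing, so this also confirms what is actually launchable in your account.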
Claude 3.5 Sonnet and Claude 3 Haiku now available in more regions
Beginning today, Amazon Bedrock customers in US West (Oregon), Europe (Frankfurt), Asia Pacific (Tokyo), and Asia Pacific (Singapore) can access Claude 3.5 Sonnet. Additionally, Amazon Bedrock customers in Asia Pacific (Tokyo) and Asia Pacific (Singapore) can access Claude 3 Haiku.

Claude 3.5 Sonnet is Anthropic's latest foundation model and ranks among the most intelligent in the world. With Claude 3.5 Sonnet, you can get intelligence better than Claude 3 Opus at one-fifth the cost. Claude 3 Haiku is Anthropic's most compact model and one of the fastest, most affordable options on the market in its intelligence category. Amazon Bedrock is a fully managed service that offers a choice of high-performing large language models (LLMs) and other foundation models from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon via a single API. Amazon Bedrock also provides a broad set of capabilities customers need to build generative AI applications with security, privacy, and responsible AI built in. These capabilities help you build tailored applications for multiple use cases across different industries, helping organizations unlock sustainable growth from generative AI while maintaining privacy and security. To learn more, read the Claude in Amazon Bedrock product page and documentation. To get started with Claude 3.5 Sonnet and Claude 3 Haiku in Amazon Bedrock, visit the Amazon Bedrock console.
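A minimal sketch of invoking the newly available models from one of the new Regions. The model IDs are the ones Anthropic published at launch; verify them in the Bedrock console for your Region, and note that `ap-northeast-1` (Tokyo) is chosen here only as an example.

```python
import json

# Bedrock model IDs as published at launch (verify for your Region).
CLAUDE_35_SONNET = "anthropic.claude-3-5-sonnet-20240620-v1:0"
CLAUDE_3_HAIKU = "anthropic.claude-3-haiku-20240307-v1:0"

def build_messages_body(prompt: str, max_tokens: int = 512) -> str:
    """Anthropic messages-API request body expected by Bedrock InvokeModel."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def ask_claude(prompt: str, model_id: str = CLAUDE_35_SONNET,
               region: str = "ap-northeast-1"):
    """Invoke the model and return the first text block of the reply."""
    import boto3  # assumes boto3 and AWS credentials with Bedrock model access
    client = boto3.client("bedrock-runtime", region_name=region)
    resp = client.invoke_model(modelId=model_id, body=build_messages_body(prompt))
    return json.loads(resp["body"].read())["content"][0]["text"]
```

Swapping `model_id` to `CLAUDE_3_HAIKU` trades some capability for lower latency and cost, which is the choice the announcement frames.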
Announcing delegated administrator for Cost Optimization Hub
Cost Optimization Hub is an AWS Billing and Cost Management feature that helps you consolidate and prioritize cost optimization recommendations, so that you can get the most out of your AWS spend. Starting today, you can designate a member account as the delegated administrator, allowing that account to view cost optimization recommendations in the Cost Optimization Hub with administrator privileges, giving you greater flexibility to identify resource optimization opportunities centrally.

Delegating an administrator allows you to manage Cost Optimization Hub independently of the management account and incorporate AWS security best practices, which recommend delegating responsibilities outside of the management account where possible. Cost Optimization Hub delegated administrators can easily identify, filter, and aggregate over 15 types of AWS cost optimization recommendations, such as EC2 instance rightsizing recommendations, idle resource recommendations, and Savings Plans recommendations, across your AWS accounts and AWS Regions through a single console page. Delegated administrator for Cost Optimization Hub is available in all AWS Regions where Cost Optimization Hub and AWS Organizations are available. To learn more about delegated administrator for Cost Optimization Hub, see the user guide.
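Delegated administrators for AWS services are registered through AWS Organizations from the management account. A hedged sketch, with the caveat that the exact service principal string for Cost Optimization Hub is an assumption here; check the user guide before using it.

```python
import re

# Assumed service principal for Cost Optimization Hub; confirm in the docs.
COH_SERVICE_PRINCIPAL = "cost-optimization-hub.bcm.amazonaws.com"

def is_valid_account_id(account_id: str) -> bool:
    """AWS account IDs are exactly 12 digits."""
    return bool(re.fullmatch(r"\d{12}", account_id))

def register_coh_delegated_admin(account_id: str):
    """Register a member account; must run from the management account."""
    import boto3  # assumes boto3 and management-account credentials
    if not is_valid_account_id(account_id):
        raise ValueError(f"not a 12-digit AWS account id: {account_id!r}")
    orgs = boto3.client("organizations")
    orgs.register_delegated_administrator(
        AccountId=account_id,
        ServicePrincipal=COH_SERVICE_PRINCIPAL,
    )
```

Once registered, the member account sees organization-wide recommendations in the Cost Optimization Hub console without needing management-account access.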
Amazon EC2 High Memory instances now available in Europe (Paris) Region
Starting today, Amazon EC2 High Memory instances with 9 TiB of memory (u-9tb1.112xlarge) are available in the Europe (Paris) Region. Customers can use these High Memory instances with On-Demand and Savings Plans purchase options.

Amazon EC2 High Memory instances are certified by SAP for running Business Suite on HANA, SAP S/4HANA, Data Mart Solutions on HANA, Business Warehouse on HANA, and SAP BW/4HANA in production environments. For details, see the Certified and Supported SAP HANA Hardware Directory. For information on how to get started with your SAP HANA migration to EC2 High Memory instances, view the Migrating SAP HANA on AWS to an EC2 High Memory Instance documentation. To hear from Steven Jones, GM for SAP on AWS, on what this launch means for our SAP customers, read his launch blog.
Amazon EFS now supports up to 30 GiB/s (a 50% increase) of read throughput
Amazon EFS provides serverless, fully elastic file storage that makes it simple to set up and run file workloads in the AWS cloud. In March 2024, we increased the Elastic Throughput read throughput limit to 20 GiB/s (from 10 GiB/s) to support the growing demand for read-heavy workloads such as AI and machine learning. Now, we are further increasing the read throughput to 30 GiB/s, extending EFS's simple, fully elastic, and provisioning-free experience to support throughput-intensive AI and machine learning workloads for model training, inference, financial analytics, and genomic data analysis.

The increased throughput limits are immediately available for EFS file systems using the Elastic Throughput mode in the US East (N. Virginia), US East (Ohio), US West (Oregon), EU West (Dublin), and Asia Pacific (Tokyo) Regions. To learn more, see the Amazon EFS Documentation or create a file system using the Amazon EFS Console, API, or AWS CLI.
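The raised limits apply to file systems in Elastic Throughput mode, which is selected at creation (or by updating an existing file system). A minimal sketch, assuming boto3 and credentials in a supported Region; the tag name is illustrative:

```python
import uuid

def elastic_fs_params(name: str) -> dict:
    """Parameters for an EFS file system in Elastic Throughput mode."""
    return {
        "CreationToken": str(uuid.uuid4()),  # idempotency token
        "ThroughputMode": "elastic",         # opts into elastic (no provisioning)
        "Encrypted": True,
        "Tags": [{"Key": "Name", "Value": name}],
    }

def create_elastic_file_system(name: str):
    """Create the file system via the EFS API."""
    import boto3  # assumes boto3 and credentials in a supported Region
    efs = boto3.client("efs")
    return efs.create_file_system(**elastic_fs_params(name))
```

No throughput value is specified anywhere: in elastic mode the file system scales up to the Region's limit automatically, which is what the 30 GiB/s increase raises.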
Amazon QuickSight now includes nested filters
Amazon QuickSight includes a new advanced filter type: nested filters. Authors can use a nested filter to use one field in a dataset to filter another field in the same dataset. You may know this concept by another name: in SQL it is a correlated subquery, and in retail analytics it underpins market basket analysis.

Nested filtering enables authors to show additional contextual data rather than filtering it out when it doesn't meet an initial condition. This is useful in many scenarios, including market basket analysis, where it is now possible to find sales quantity by product for customers who purchased, or did not purchase, a specific product. It is also possible to find the group of customers who purchased none of the products in a selected list, or only purchased products from a specific list. Nested filters are now available in all supported Amazon QuickSight Regions: US East (Ohio and N. Virginia), US West (Oregon), Asia Pacific (Mumbai, Seoul, Singapore, Sydney and Tokyo), Canada (Central), Europe (Frankfurt, Ireland and London), South America (São Paulo), and AWS GovCloud (US-West). See here for QuickSight regional endpoints. For more on how to set up a nested filter, go to our documentation here and blog post here.
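To make the correlated-subquery shape concrete, here is the market-basket example in plain Python over toy data: totals by product, restricted to customers who did (or did not) buy an anchor product. The data and names are illustrative only.

```python
# Toy purchase rows: (customer, product, quantity).
rows = [
    ("alice", "coffee", 2), ("alice", "mug", 1),
    ("bob",   "tea",    3), ("bob",   "mug", 2),
    ("carol", "coffee", 1), ("carol", "tea", 1),
]

def sales_for_buyers_of(rows, anchor_product, include=True):
    """Quantity by product, keeping only customers who did (or did not)
    buy anchor_product -- the nested-filter / correlated-subquery shape."""
    buyers = {c for c, p, _ in rows if p == anchor_product}
    keep = buyers if include else {c for c, _, _ in rows} - buyers
    totals = {}
    for c, p, q in rows:
        if c in keep:
            totals[p] = totals.get(p, 0) + q
    return totals

# Sales by product for customers who bought coffee (alice and carol):
print(sales_for_buyers_of(rows, "coffee"))
# ...and for customers who never bought coffee (bob only):
print(sales_for_buyers_of(rows, "coffee", include=False))
```

The inner set comprehension plays the role of the correlated subquery; a QuickSight nested filter expresses the same two-step restriction without code.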
AWS Blogs
AWS Japan Blog (Japanese)
- Detailed Explanation: Government Cloud Name Resolution Edition
- AWS Weekly Roundup: Amazon Q Business, AWS CloudFormation, Amazon WorkSpaces Updates, etc. (August 5, 2024)
- AWS Certification: Adding a New Type of Exam Question
AWS Architecture Blog
AWS Cloud Operations & Migrations Blog
- Streamline compliance management with AWS Config custom rules and conformance packs
- Improve Amazon Bedrock Observability with Amazon CloudWatch AppSignals
AWS Big Data Blog
- OpenSearch optimized instance (OR1) is game changing for indexing performance and cost
- AWS Glue mutual TLS authentication for Amazon MSK
AWS Compute Blog
AWS Database Blog
- Better Together: Amazon SageMaker Canvas and RDS for SQL Server, a predictive ML model sample use case
- Power real-time vector search capabilities with Amazon MemoryDB
AWS DevOps Blog
- Generating Accurate Git Commit Messages with Amazon Q Developer CLI Context Modifiers
- Implementing Identity-Aware Sessions with Amazon Q Developer
- How to use Amazon Q Developer to deploy a Serverless web application with AWS CDK
Integration & Automation
AWS Machine Learning Blog
- Improve AI assistant response accuracy using Knowledge Bases for Amazon Bedrock and a reranking model
- Automate the machine learning model approval process with Amazon SageMaker Model Registry and Amazon SageMaker Pipelines
AWS for M&E Blog
AWS Security Blog
AWS Storage Blog
Open Source Project
AWS CLI
OpenSearch
Amplify for JavaScript
- tsc-compliance-test@0.1.47
- aws-amplify@6.5.0
- @aws-amplify/storage@6.6.0
- @aws-amplify/pubsub@6.1.17
- @aws-amplify/predictions@6.1.17
- @aws-amplify/notifications@2.0.42
- @aws-amplify/interactions@6.0.41
- @aws-amplify/geo@3.0.42
- @aws-amplify/datastore-storage-adapter@2.1.44
- @aws-amplify/datastore@5.0.44