3/13/2025, 12:00:00 AM ~ 3/14/2025, 12:00:00 AM (UTC)
Recent Announcements
AppSync Events adds publishing over WebSocket for real-time pub/sub
AWS AppSync Events is a fully managed service that allows developers to create secure and performant WebSocket APIs. Starting today, developers can use their AppSync Events APIs to publish events directly over WebSocket connections, complementing the existing HTTP API publishing capability. This enhancement enables applications to both publish and subscribe to events using a single WebSocket connection, streamlining the implementation of real-time features.
The new WebSocket publishing capability simplifies the development of collaborative applications such as chat systems, multiplayer games, and shared document editing. Developers can now maintain a single connection for bi-directional communication, reducing complexity and improving performance by eliminating the need to manage separate connections for publishing and subscribing to events. This approach helps reduce latency in real-time interactive applications by removing the overhead of establishing new HTTP connections for each event publication. This feature is now available in all AWS Regions where AWS AppSync is supported. To get started, developers can use their favorite WebSocket client. For more information, view our new blog post and visit the AWS AppSync documentation for detailed implementation examples and best practices.
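As a rough illustration of the single-connection flow, the sketch below publishes an event over the same WebSocket used for pub/sub, using the Python websockets library. The endpoint, API-key header handling, subprotocol, and publish message envelope are all assumptions for illustration only; the real AppSync Events protocol encodes authorization and messages in a specific format, so check the AppSync Events WebSocket protocol documentation before relying on this.

```python
# Hypothetical sketch: publish to an AppSync Events channel over one WebSocket.
# Endpoint, auth headers, subprotocol, and message envelope are ASSUMPTIONS --
# verify them against the AppSync Events WebSocket protocol documentation.
import asyncio
import json
import websockets  # pip install websockets

REALTIME_ENDPOINT = "wss://example123.appsync-realtime-api.us-east-1.amazonaws.com/event/realtime"  # placeholder
API_KEY = "da2-exampleapikey"  # placeholder API key

async def publish_once() -> None:
    async with websockets.connect(
        REALTIME_ENDPOINT,
        subprotocols=["aws-appsync-event-ws"],          # assumed subprotocol name
        additional_headers={"x-api-key": API_KEY},      # 'extra_headers' on websockets < 14
    ) as ws:
        # Assumed publish envelope: a type, a channel, and JSON-encoded events.
        await ws.send(json.dumps({
            "type": "publish",
            "id": "msg-1",
            "channel": "/default/chat",
            "events": [json.dumps({"user": "alice", "text": "hello"})],
        }))
        print(await ws.recv())  # acknowledgement or error returned by the service

asyncio.run(publish_once())
```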
Amazon S3 reduces pricing for S3 object tagging by 35%
Amazon S3 reduces pricing for S3 object tagging by 35% in all AWS Regions to $0.0065 per 10,000 tags per month. Object tags are key-value pairs applied to S3 objects that can be created, updated, or deleted at any time during the lifetime of the object.
S3 object tags help you logically group data for a variety of reasons such as to apply IAM policies to provide fine-grained access, to specify tag-based filters to manage object lifecycle rules, and to selectively replicate data to another AWS Region. Additionally, in AWS Regions where S3 Metadata is available, you can easily capture and query custom metadata that is stored as object tags. S3 object tags are available in all AWS Regions including the AWS China and AWS GovCloud (US) Regions. This new pricing takes effect automatically in the monthly billing cycle starting on March 1, 2025. To learn more about object tags, refer to the documentation. For more pricing details, visit the S3 pricing page.
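For reference, tagging an object is a single boto3 call, and the new price makes costs easy to estimate. The bucket, key, and tag values below are placeholders.

```python
# Minimal sketch: apply and read tags on an S3 object with boto3.
import boto3

s3 = boto3.client("s3")

s3.put_object_tagging(
    Bucket="example-bucket",
    Key="reports/2025/03/summary.parquet",
    Tagging={"TagSet": [
        {"Key": "project", "Value": "alpha"},
        {"Key": "classification", "Value": "internal"},
    ]},
)

tags = s3.get_object_tagging(Bucket="example-bucket", Key="reports/2025/03/summary.parquet")
print(tags["TagSet"])

# Cost illustration at the new price of $0.0065 per 10,000 tags per month:
# 1,000,000 objects x 2 tags = 2,000,000 tags
# (2_000_000 / 10_000) * 0.0065 = $1.30 per month
```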
Amazon Bedrock now available in the Europe (Milan) and Europe (Spain) Regions
Customers can use regional processing profiles for Amazon Nova understanding models (Amazon Nova Lite, Amazon Nova Micro, and Amazon Nova Pro) in the Europe (Milan) and Europe (Spain) Regions.
Amazon Bedrock is a fully managed service that offers a choice of high-performing large language models (LLMs) and other foundation models (FMs) from leading AI companies, including AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon, via a single API. Amazon Bedrock also provides a broad set of capabilities customers need to build generative AI applications with security, privacy, and responsible AI built in. These capabilities help you build tailored applications for multiple use cases across different industries, helping organizations unlock sustained growth from generative AI while ensuring customer trust and data governance. To get started, visit the Amazon Bedrock page and see the Amazon Bedrock documentation for more details.
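A minimal sketch of calling a Nova model in one of the new Regions via the Bedrock Converse API follows. The regional profile ID shown is an assumption; list the inference profiles available in your account (for example, with the ListInferenceProfiles API or in the console) before using it.

```python
# Minimal sketch: invoke an Amazon Nova model through Amazon Bedrock in the
# Europe (Spain) Region using the Converse API. The profile/model ID below is
# an ASSUMPTION -- check what is available in your account and Region.
import boto3

region = "eu-south-2"  # Europe (Spain); use "eu-south-1" for Europe (Milan)
bedrock_runtime = boto3.client("bedrock-runtime", region_name=region)

response = bedrock_runtime.converse(
    modelId="eu.amazon.nova-lite-v1:0",  # assumed EU regional profile ID for Nova Lite
    messages=[{"role": "user", "content": [{"text": "Summarize Amazon Bedrock in one sentence."}]}],
    inferenceConfig={"maxTokens": 200, "temperature": 0.3},
)
print(response["output"]["message"]["content"][0]["text"])
```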
Amazon Route 53 Traffic Flow introduces a new visual editor to improve DNS policy editing
Amazon Route 53 Traffic Flow now offers an enhanced user interface for improved DNS traffic policy editing. Route 53 Traffic Flow is a network traffic management feature that simplifies the process of creating and maintaining DNS records in large and complex configurations by providing users with an interactive DNS policy management flow chart in their web browsers. With this release, you can more easily understand and change the way traffic is routed between users and endpoints using the new features of the visual editor.
Traffic Flow now provides a clearer way to craft DNS routing policies for many endpoints and multiple routing methods by moving configurations into a new sidebar, providing an undo/redo button, and introducing a new text editor for changing JavaScript Object Notation (JSON) files right within your browser. The JSON editor includes syntax highlighting and can be used in conjunction with a new ‘Dark Mode’ theme to show where policy edits should be made. The new Traffic Flow experience is available globally, except in AWS GovCloud and Amazon Web Services in China. These enhancements are offered at no additional cost; Traffic Flow pricing information can be found here. To learn more about how to use Traffic Flow, visit our documentation or see this blog post.
Amazon S3 Tables integration with SageMaker Lakehouse is now generally available
Amazon S3 Tables now seamlessly integrate with Amazon SageMaker Lakehouse, making it easy to query and join S3 Tables with data in S3 data lakes, Amazon Redshift data warehouses, and third-party data sources. S3 Tables deliver the first cloud object store with built-in Apache Iceberg support. SageMaker Lakehouse is a unified, open, and secure data lakehouse that simplifies your analytics and artificial intelligence (AI) workflows. All data in SageMaker Lakehouse can be queried from SageMaker Unified Studio and engines such as Amazon EMR, AWS Glue, Amazon Redshift, and Amazon Athena, as well as Apache Iceberg-compatible engines like Apache Spark or PyIceberg.
SageMaker Lakehouse provides the flexibility to access and query data in place across S3 Tables, S3 buckets, and Redshift warehouses using the Apache Iceberg open standard. You can secure and centrally manage your data in the lakehouse by defining fine-grained permissions that are consistently applied across all analytics and ML tools and engines. You can access SageMaker Lakehouse from Amazon SageMaker Unified Studio, a single data and AI development environment that brings together functionality and tools from AWS analytics and AI/ML services. The integrated experience to access S3 Tables with SageMaker Lakehouse is generally available in all AWS Regions where S3 Tables are available. To get started, enable the S3 Tables integration with Amazon SageMaker Lakehouse, which allows AWS analytics services to automatically discover and access your S3 Tables data. To learn more about the S3 Tables integration, visit the documentation and product page. To learn more about SageMaker Lakehouse, visit the documentation, product page, and read the launch blog.
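As a hedged example, the snippet below queries an S3 table through the Lakehouse integration from Amazon Athena using boto3. The federated catalog name, namespace, table, and results location are assumptions or placeholders; check how the integration registers your table bucket in the Glue Data Catalog.

```python
# Minimal sketch: query an S3 table via the SageMaker Lakehouse integration
# from Amazon Athena. Catalog/database/table names and the output location
# are ASSUMPTIONS/placeholders.
import time
import boto3

athena = boto3.client("athena")

qid = athena.start_query_execution(
    QueryString='SELECT * FROM "mynamespace"."daily_sales" LIMIT 10',
    QueryExecutionContext={
        "Catalog": "s3tablescatalog/my-table-bucket",  # assumed federated catalog naming
        "Database": "mynamespace",
    },
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # placeholder
)["QueryExecutionId"]

# Poll until the query finishes, then print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```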
Amazon RDS for MySQL announces Extended Support minor 5.7.44-RDS.20250213
Amazon Relational Database Service (RDS) for MySQL announces Amazon RDS Extended Support minor version 5.7.44-RDS.20250213. We recommend that you upgrade to this version to fix known security vulnerabilities and bugs in prior versions of MySQL. Learn more about upgrading your database instances, including minor and major version upgrades, in the Amazon RDS User Guide.
Amazon RDS Extended Support gives you more time, up to three years, to upgrade to a new major version to help you meet your business requirements. During Extended Support, Amazon RDS will provide critical security and bug fixes for your MySQL databases on Aurora and RDS after the community ends support for a major version. You can run your MySQL databases on Amazon RDS with Extended Support for up to three years beyond a major version’s end of standard support date. Learn more about Extended Support in the Amazon RDS User Guide and the Pricing FAQs. Amazon RDS for MySQL makes it simple to set up, operate, and scale MySQL deployments in the cloud. See Amazon RDS for MySQL Pricing for pricing details and regional availability. Create or update a fully managed Amazon RDS database in the Amazon RDS Management Console.
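Applying the new minor version is a one-line change with boto3, sketched below with a placeholder instance identifier.

```python
# Minimal sketch: move an RDS for MySQL 5.7 instance to the Extended Support
# minor version. The instance identifier is a placeholder.
import boto3

rds = boto3.client("rds")

rds.modify_db_instance(
    DBInstanceIdentifier="my-mysql57-instance",
    EngineVersion="5.7.44-RDS.20250213",
    ApplyImmediately=True,  # set False to defer to the next maintenance window
)
```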
Introducing Amazon EC2 I8g.48xlarge instances in US East (N. Virginia) and US West (Oregon) regions
AWS announces the general availability of a new, larger size (48xlarge) for Amazon EC2 I8g instances in the US East (N. Virginia) and US West (Oregon) Regions. The new size expands the I8g portfolio to support up to 192 vCPUs, providing additional compute options to scale up existing workloads or run larger applications that need additional CPU and memory. I8g instances are powered by AWS Graviton4 processors that deliver up to 60% better compute performance compared to previous-generation I4g instances. I8g instances use the latest third-generation AWS Nitro SSDs, local NVMe storage that delivers up to 65% better real-time storage performance per TB while offering up to 50% lower storage I/O latency and up to 60% lower storage I/O latency variability. These instances are built on the AWS Nitro System, which offloads CPU virtualization, storage, and networking functions to dedicated hardware and software, enhancing the performance and security of your workloads.
I8g instances offer instance sizes up to 48xlarge, 1,536 GiB of memory, and 45 TB of instance storage. They are ideal for real-time applications like relational databases, non-relational databases, streaming databases, search, and data analytics. To learn more, see Amazon I8g instances. To learn how to migrate your workloads to AWS Graviton-based instances, see Getting started with Graviton. To get started, visit the AWS Management Console.
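Launching the new size works like any other instance type; the sketch below uses placeholder AMI, key pair, and subnet values (Graviton4 requires an arm64 AMI).

```python
# Minimal sketch: launch an i8g.48xlarge instance with boto3.
# ImageId, KeyName, and SubnetId are placeholders; choose an arm64 AMI.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # also available in us-west-2

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder arm64 AMI
    InstanceType="i8g.48xlarge",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",                # placeholder
    SubnetId="subnet-0123456789abcdef0",  # placeholder
)
```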
Amazon GuardDuty Malware Protection for S3 now available in AWS GovCloud (US) Regions
Today, Amazon Web Services (AWS) announces the availability of Amazon GuardDuty Malware Protection for Amazon S3 in the AWS GovCloud (US) Regions. This expansion of GuardDuty Malware Protection allows you to scan newly uploaded objects to Amazon S3 buckets for potential malware, viruses, and other suspicious uploads and take action to isolate them before they are ingested into downstream processes.
GuardDuty helps customers protect millions of Amazon S3 buckets and AWS accounts. GuardDuty Malware Protection for Amazon S3 is fully managed by AWS, alleviating the operational complexity and overhead that normally comes with managing a data-scanning pipeline, with compute infrastructure operated on your behalf. This feature also gives application owners more control over the security of their organization’s S3 buckets; they can enable GuardDuty Malware Protection for S3 even if core GuardDuty is not enabled in the account. Scan results are automatically published to Amazon EventBridge, so application owners can build downstream workflows, such as isolating objects to a quarantine bucket, or define bucket policies using tags that prevent users or applications from accessing certain objects. GuardDuty Malware Protection for Amazon S3 is available in all AWS Regions where GuardDuty is available, excluding China Regions.
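A hedged sketch of wiring scan results into a quarantine workflow with EventBridge follows. The detail-type string and the Lambda target ARN are assumptions; confirm the exact event pattern in the GuardDuty documentation.

```python
# Hedged sketch: route GuardDuty Malware Protection for S3 scan results to a
# quarantine handler with EventBridge. The detail-type and target ARN are
# ASSUMPTIONS/placeholders.
import json
import boto3

events = boto3.client("events")

events.put_rule(
    Name="guardduty-s3-malware-scan-results",
    EventPattern=json.dumps({
        "source": ["aws.guardduty"],
        # Assumed detail-type for Malware Protection for S3 scan results.
        "detail-type": ["GuardDuty Malware Protection Object Scan Result"],
    }),
    State="ENABLED",
)

events.put_targets(
    Rule="guardduty-s3-malware-scan-results",
    Targets=[{
        "Id": "quarantine-handler",
        "Arn": "arn:aws:lambda:us-gov-west-1:111122223333:function:quarantine-object",  # placeholder
    }],
)
```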
AWS Backup adds support for Amazon FSx for OpenZFS in additional AWS Regions
Today, we are announcing the availability of AWS Backup support for Amazon FSx for OpenZFS in 13 additional AWS Regions. AWS Backup is a policy-based, fully managed, and cost-effective solution that enables you to centralize and automate data protection of AWS services (spanning compute, storage, and databases) and third-party applications. With this launch, AWS Backup customers can help meet business continuity, disaster recovery, and compliance requirements by protecting Amazon FSx for OpenZFS backups in additional Regions.
AWS Backup support for Amazon FSx for OpenZFS is added in the following Regions: Africa (Cape Town), Asia Pacific (Hyderabad, Jakarta, Osaka), Europe (Milan, Paris, Spain, Zurich), Israel (Tel Aviv), Middle East (Bahrain, UAE), South America (São Paulo), and US West (N. California). To learn more about AWS Backup for Amazon FSx, visit the AWS Backup product page, technical documentation, and pricing page. For more information on the AWS Backup features available across AWS Regions, see the AWS Backup documentation. To get started, visit the AWS Backup console.
AWS Service Reference Information now supports resources and condition keys
Today, AWS is expanding service reference information to include resources and condition keys, providing a more comprehensive view of service permissions. Service reference information streamlines automation of policy management workflows, helping you retrieve available actions across AWS services from machine-readable files. Whether you are a security administrator establishing guardrails for workloads or a developer ensuring appropriate access to applications, you can now more easily identify the available actions, resources, and condition keys for each AWS service.
You can automate the retrieval of service reference information, eliminating manual effort and ensuring your policies align with the latest service updates. You can also incorporate this service reference directly into your existing policy management tools and processes for a seamless integration. This feature is offered at no additional cost. To get started, refer to the documentation on programmatic service reference information.
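A minimal sketch of retrieving the machine-readable files follows. The index endpoint URL and JSON field names are assumptions based on the service reference documentation; verify them before building automation on top.

```python
# Hedged sketch: pull the machine-readable service reference files and list
# actions/resources/condition keys for one service. The endpoint URL and
# JSON field names are ASSUMPTIONS -- check the documentation.
import json
import urllib.request

INDEX_URL = "https://servicereference.us-east-1.amazonaws.com/"  # assumed index endpoint

def fetch_json(url: str):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

index = fetch_json(INDEX_URL)  # assumed: a list of {"service": ..., "url": ...} entries
s3_entry = next(e for e in index if e.get("service") == "s3")
s3_ref = fetch_json(s3_entry["url"])

# Assumed top-level keys in each per-service file.
print(len(s3_ref.get("Actions", [])), "actions")
print([r.get("Name") for r in s3_ref.get("Resources", [])][:5])
print([c.get("Name") for c in s3_ref.get("ConditionKeys", [])][:5])
```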
Amazon S3 Tables add Apache Iceberg REST Catalog APIs
Amazon S3 Tables now offer table management APIs that are compatible with the Apache Iceberg REST Catalog standard, enabling any Iceberg-compatible application to easily create, update, list, and delete tables in an S3 table bucket.
These new table management APIs, which map directly to S3 Tables operations, make it easier for you to get started with S3 Tables if you have a custom catalog implementation, need only basic read and write access to tabular data in a single S3 table bucket, or use an APN partner-provided catalog. For unified data management across all of your tabular data, data governance, and fine-grained access controls, you can use S3 Tables with SageMaker Lakehouse. The new table management APIs are available in all AWS Regions where S3 Tables are available, at no additional cost. To learn more about S3 Tables, visit the documentation and product page. To learn more about SageMaker Lakehouse, visit the product page.
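As an illustration, a PyIceberg client can point its REST catalog at the S3 Tables endpoint roughly as follows. The endpoint URL format, warehouse ARN, and SigV4 properties are assumptions or placeholders; confirm them in the S3 Tables documentation for your Region.

```python
# Hedged sketch: connect PyIceberg to the S3 Tables Iceberg REST Catalog.
# The endpoint, warehouse ARN, and signing properties are ASSUMPTIONS/placeholders.
from pyiceberg.catalog import load_catalog  # pip install "pyiceberg[pyarrow]"

catalog = load_catalog(
    "s3tables",
    **{
        "type": "rest",
        "uri": "https://s3tables.us-east-1.amazonaws.com/iceberg",                      # assumed endpoint
        "warehouse": "arn:aws:s3tables:us-east-1:111122223333:bucket/my-table-bucket",  # placeholder ARN
        "rest.sigv4-enabled": "true",
        "rest.signing-name": "s3tables",
        "rest.signing-region": "us-east-1",
    },
)

print(catalog.list_namespaces())
print(catalog.list_tables("mynamespace"))            # assumes the namespace exists
table = catalog.load_table("mynamespace.daily_sales")  # placeholder table name
print(table.scan().to_arrow())
```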
Amazon SageMaker Unified Studio is now generally available
AWS announces the general availability of Amazon SageMaker Unified Studio, a single data and AI development environment that brings together functionality and tools from AWS analytics and AI/ML services, including Amazon EMR, AWS Glue, Amazon Athena, Amazon Redshift, Amazon Bedrock, and Amazon SageMaker AI. This launch includes simplified permissions management that makes it easier to bring existing AWS resources to the unified studio. SageMaker Unified Studio allows you to find, access, and query data and AI assets across your organization, then collaborate in projects to securely build and share analytics and AI artifacts, including data, models, and generative AI applications. Unified access to your data is provided by Amazon SageMaker Lakehouse and governance capabilities are built in via Amazon SageMaker Catalog.
Amazon Q Developer is now generally available in SageMaker Unified Studio, providing generative AI-powered assistance across the development lifecycle. Amazon Q Developer streamlines development by offering natural language, conversational interfaces that simplify tasks like writing SQL queries, building ETL jobs, troubleshooting, and generating real-time code suggestions. The Free Tier of Amazon Q Developer is available by default in SageMaker Unified Studio; customers with existing Amazon Q Developer Pro Tier subscriptions can access additional features.
Selected capabilities from Amazon Bedrock are also generally available in SageMaker Unified Studio. You can rapidly prototype, customize, and share generative AI applications using high-performing foundation models and advanced features such as Amazon Bedrock Knowledge Bases, Amazon Bedrock Guardrails, Amazon Bedrock Agents, and Amazon Bedrock Flows to create tailored solutions aligned to your requirements and responsible AI guidelines.
See Supported Regions for a list of AWS Regions where SageMaker Unified Studio is generally available. To learn more about SageMaker Unified Studio and how it can accelerate data and AI development, see the Amazon SageMaker Unified Studio webpage or documentation. You can start using SageMaker Unified Studio today by selecting “Amazon SageMaker” in the AWS Console.
Amazon S3 Tables add create and query table support in the S3 console
Amazon S3 Tables now support create and query table operations directly from the S3 console using Amazon Athena. With this new feature, you can create a table, populate it with data, and query it with just a few steps in the S3 console.
To get started, enable the S3 Tables integration with Amazon SageMaker Lakehouse, which allows AWS analytics services to automatically discover and access your S3 Tables data. Then, select a table bucket and choose “Create table with Athena”, or select an existing table and choose “Query table with Athena”. As the first cloud object store with built-in Apache Iceberg support, S3 Tables offer the easiest way to store tabular data at scale. You can access S3 Tables with AWS analytics services through the now generally available SageMaker Lakehouse integration, as well as Apache Iceberg-compatible open source engines like Apache Spark and Apache Flink. This support is available in all AWS Regions where S3 Tables are available. To learn more about S3 Tables, visit the product page and documentation. To learn more about the integration between S3 Tables and SageMaker Lakehouse, read the AWS News Blog.
AWS Amplify Hosting announces deployment skew protection support
AWS Amplify Hosting is excited to offer Skew Protection, a powerful feature that guarantees version consistency across your deployments. This feature ensures frontend requests are always routed to the correct backend server version, eliminating version skew and making deployments more reliable.
You can enable this feature at the branch level in the Amplify Console under App Settings → Branch Settings. There is no additional cost associated with this feature and it is available to all customers. This feature is available in all 20 AWS Amplify Hosting Regions: US East (Ohio), US East (N. Virginia), US West (N. California), US West (Oregon), Asia Pacific (Hong Kong), Asia Pacific (Tokyo), Asia Pacific (Osaka), Asia Pacific (Seoul), Asia Pacific (Mumbai), Asia Pacific (Singapore), Asia Pacific (Sydney), Canada (Central), Europe (Frankfurt), Europe (Stockholm), Europe (Milan), Europe (Ireland), Europe (London), Europe (Paris), Middle East (Bahrain), and South America (São Paulo). To get started, check out our blog post or read the documentation.
Amazon QuickSight now available in the Europe (Spain) Region
Amazon QuickSight, a fast, scalable, and fully managed Business Intelligence (BI) service that lets you easily create and publish interactive dashboards across your organization, is now available in the Europe (Spain) Region. QuickSight dashboards can be authored in any modern web browser with no clients to install or manage, and dashboards can be shared with tens of thousands of users without the need to provision or manage any infrastructure. QuickSight dashboards can also be seamlessly embedded into your applications, portals, and websites to provide rich, interactive analytics for end users.
With this launch, QuickSight expands to 23 Regions: US East (Ohio and N. Virginia), US West (Oregon), Europe (Spain, Stockholm, Paris, Frankfurt, Ireland, London, Milan, and Zurich), Asia Pacific (Mumbai, Seoul, Singapore, Sydney, Beijing, Tokyo, and Jakarta), Canada (Central), South America (São Paulo), Africa (Cape Town), and AWS GovCloud (US-East, US-West). To learn more about Amazon QuickSight, please see our product page, documentation, and available Regions here.
AWS Backup adds logically air-gapped vault support for Amazon FSx
Today, we are announcing the availability of AWS Backup logically air-gapped vault support for Amazon FSx for Lustre, Amazon FSx for Windows File Server, and Amazon FSx for OpenZFS. A logically air-gapped vault is a type of AWS Backup vault that allows secure sharing of backups across accounts and organizations, supporting direct restore to reduce recovery time from a data loss event. A logically air-gapped vault stores immutable backup copies that are locked by default, and isolated with encryption using AWS owned keys.
You can now protect your Amazon FSx file system in logically air-gapped vaults in either the same account or across other accounts and Regions. This helps reduce the risk of downtime, ensure business continuity, and meet compliance and disaster recovery requirements. You can get started using the AWS Backup console, API, or CLI. Target Amazon FSx backups to a logically air-gapped vault by specifying it as a copy destination in your backup plan. Share the vault for recovery or restore testing with other accounts using AWS Resource Access Manager (RAM). Once shared, you can initiate direct restore jobs from that account, eliminating the overhead of copying backups first. AWS Backup support for the three Amazon FSx file systems is available in all the Regions where logically air-gapped vault and the respective Amazon FSx file systems are supported. For more information, visit the AWS Backup product page and documentation.
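A hedged boto3 sketch of targeting FSx backups to a logically air-gapped vault via a copy action follows; vault names, ARNs, schedule, and retention values are placeholders.

```python
# Minimal sketch: a backup plan whose copy destination is a logically
# air-gapped vault. Names, ARNs, schedule, and retention are placeholders.
import boto3

backup = boto3.client("backup")

plan = backup.create_backup_plan(BackupPlan={
    "BackupPlanName": "fsx-daily-with-airgapped-copy",
    "Rules": [{
        "RuleName": "daily",
        "TargetBackupVaultName": "primary-vault",
        "ScheduleExpression": "cron(0 5 * * ? *)",
        "Lifecycle": {"DeleteAfterDays": 35},
        "CopyActions": [{
            # Destination is the logically air-gapped vault (same or another account/Region).
            "DestinationBackupVaultArn": "arn:aws:backup:us-east-1:111122223333:backup-vault:airgapped-vault",
            "Lifecycle": {"DeleteAfterDays": 365},
        }],
    }],
})

# Assign an FSx file system to the plan (by ARN here; tags also work).
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "fsx-file-systems",
        "IamRoleArn": "arn:aws:iam::111122223333:role/service-role/AWSBackupDefaultServiceRole",
        "Resources": ["arn:aws:fsx:us-east-1:111122223333:file-system/fs-0123456789abcdef0"],
    },
)
```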
Amazon Bedrock’s capabilities now generally available within Amazon SageMaker Unified Studio
Amazon Bedrock’s capabilities are now generally available within Amazon SageMaker Unified Studio, offering a governed collaborative environment that empowers developers to rapidly create and customize generative AI applications. This intuitive interface caters to developers of all skill levels, providing seamless access to Amazon Bedrock’s high-performance foundation models (FMs) and advanced customization tools for collaborative development of tailored generative AI applications.
Amazon Bedrock can be accessed through the AWS Management Console, APIs, or Amazon SageMaker Unified Studio. Its integration in Amazon SageMaker Unified Studio eliminates barriers between data, tools, and developers in the generative AI development process. Teams gain a unified development experience by accessing familiar JupyterLab environments and analytics tools while seamlessly incorporating Amazon Bedrock’s powerful generative AI capabilities, all within the same workspace. Developers can harness Retrieval Augmented Generation (RAG) to build Knowledge Bases from proprietary data sources, utilize Agents and Flows for complex task automation, and implement Guardrails for responsible AI development. This consolidated workspace reduces complexity, enabling faster prototyping, iteration, and deployment of production-ready, responsible generative AI applications that align with specific business requirements. Amazon Bedrock in SageMaker Unified Studio can now be accessed in all 12 Regions where SageMaker Unified Studio is available, spanning Europe, South America, Asia Pacific, US East, and US West. For more information on supported Regions, please refer to the Amazon SageMaker Unified Studio regions guide. Learn more about Amazon Bedrock’s capabilities in Amazon SageMaker Unified Studio by visiting the capability page, and get started by enabling a “Generative AI application development” project profile using this admin guide.
AWS Blogs
AWS Japan Blog (Japanese)
AWS News Blog
- Collaborate and build faster with Amazon SageMaker Unified Studio, now generally available
- Amazon S3 Tables integration with Amazon SageMaker Lakehouse is now generally available
AWS Big Data Blog
- Accelerate analytics and AI innovation with the next generation of Amazon SageMaker
- Announcing end-of-support for Amazon Kinesis Client Library 1.x and Amazon Kinesis Producer Library 0.x effective January 30, 2026
- Deploy real-time analytics with StarTree for managed Apache Pinot on AWS
Containers
AWS Database Blog
- Multiple database support on Amazon RDS for Db2 DB instance
- Build resilient Oracle Database workloads on Amazon EC2
AWS DevOps & Developer Productivity Blog
Front-End Web & Mobile
- Building real-time apps with AWS AppSync Events’ WebSocket publishing
- Amplify Hosting Announces Skew Protection Support
AWS for Industries
- Software-defined Vehicles, GenAI, IoT – The Path to AI-Defined Vehicles
- SCADA Disaster Recovery on AWS for Inductive Automation’s Ignition
AWS Machine Learning Blog
- How GoDaddy built a category generation system at scale with batch inference for Amazon Bedrock
- Benchmarking customized models on Amazon Bedrock using LLMPerf and LiteLLM
- Creating asynchronous AI agents with Amazon Bedrock
- How to run Qwen 2.5 on AWS AI chips using Hugging Face libraries
- Revolutionizing customer service: MaestroQA’s integration with Amazon Bedrock for actionable insight
- Optimize hosting DeepSeek-R1 distilled models with Hugging Face TGI on Amazon SageMaker AI
AWS Messaging & Targeting Blog
AWS Security Blog
- Secure cloud innovation starts at re:Inforce 2025
- Manage authorization within a containerized workload using Amazon Verified Permissions