7/15/2025, 12:00:00 AM ~ 7/16/2025, 12:00:00 AM (UTC)
Recent Announcements
Amazon SageMaker introduces a visual workflows builder
Amazon SageMaker now offers a visual builder experience for creating and managing workflows. This feature is part of the next generation of Amazon SageMaker, the center for all your data, analytics, and AI, and is available within SageMaker Unified Studio, a single data and AI development environment. Visual workflows in Amazon SageMaker provide a drag-and-drop interface for building workflows and simplify authoring and scheduling for data engineers and data scientists.

With visual workflows, you now have a visual, low-code way to represent a series of tasks, such as a data processing job that loads information into a table followed by a notebook that analyzes the loaded data. You can get started quickly by authoring your workflow visually and then continue to customize it with code, if desired. Amazon SageMaker uses Amazon Managed Workflows for Apache Airflow (MWAA) to run workflows. With this new feature, you can create and modify workflows, view and adjust workflow schedules, pause or resume schedules as needed, and monitor the status and logs of workflow runs. This visual approach simplifies workflow management and makes your data processes easier to operate. The feature is now available in all AWS Regions where Amazon SageMaker is available; see the supported Region list for the most up-to-date availability information. To learn more, visit the Amazon SageMaker Unified Studio documentation, the launch blog post, and the MWAA pricing page.
Amazon Redshift now supports automatic refresh of materialized views defined on external Apache Iceberg tables in the Amazon S3 data lake.

With this update, Amazon Redshift automatically refreshes a materialized view defined on Apache Iceberg tables residing in Amazon S3 whenever new data is added to or deleted from those tables.
You can start using this new capability immediately to build more complex and flexible analytics pipelines in the Amazon S3 data lake. To learn more about automatic refresh of materialized views and to get started with using materialized views in Amazon Redshift, refer to the Autorefreshing a materialized view sub-section of the Refreshing a materialized view section of the Amazon Redshift Materialized Views documentation.
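As a sketch of how the feature might be used (all object names below are hypothetical), a materialized view over an external Iceberg table opts into automatic refresh with the `AUTO REFRESH YES` clause; the resulting statement could then be submitted through the Redshift Data API. The helper here only builds the SQL string:

```python
def create_auto_refresh_mv_sql(mv_name: str, external_table: str,
                               select_list: str = "*") -> str:
    """Build a CREATE MATERIALIZED VIEW statement with automatic refresh
    enabled, targeting an external (e.g. Apache Iceberg) table registered
    in an external schema. Names here are illustrative, not prescribed."""
    return (
        f"CREATE MATERIALIZED VIEW {mv_name} "
        f"AUTO REFRESH YES AS "
        f"SELECT {select_list} FROM {external_table};"
    )

# Hypothetical example: an Iceberg table exposed via an external schema.
sql = create_auto_refresh_mv_sql("sales_mv", "iceberg_schema.daily_sales")
print(sql)

# To actually run it, one option is the Redshift Data API, e.g.:
# import boto3
# boto3.client("redshift-data").execute_statement(
#     WorkgroupName="my-workgroup", Database="dev", Sql=sql)
```

Once created this way, the view needs no manual `REFRESH MATERIALIZED VIEW` calls for the Iceberg source; Redshift handles the refresh as the underlying table changes.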
Amazon EC2 I8g instances now available in additional AWS regions
AWS is announcing the general availability of Amazon EC2 Storage Optimized I8g instances in the Europe (Spain), Asia Pacific (Sydney), Canada (Central), and Asia Pacific (Mumbai) Regions. I8g instances offer the best performance in Amazon EC2 for storage-intensive workloads. They are powered by AWS Graviton4 processors, which deliver up to 60% better compute performance than previous-generation I4g instances, and use the latest third-generation AWS Nitro SSDs: local NVMe storage that delivers up to 65% better real-time storage performance per TB, with up to 50% lower storage I/O latency and up to 60% lower storage I/O latency variability. These instances are built on the AWS Nitro System, which offloads CPU virtualization, storage, and networking functions to dedicated hardware and software, enhancing the performance and security of your workloads.

Amazon EC2 I8g instances are designed for I/O-intensive workloads that require rapid data access and real-time latency from storage. They excel at handling transactional, real-time, distributed databases, including MySQL, PostgreSQL, HBase, and NoSQL solutions such as Aerospike, MongoDB, ClickHouse, and Apache Druid. They are also optimized for real-time analytics platforms such as Apache Spark, data lakehouses, and LLM training data pre-processing. I8g instances are available in 10 sizes up to 48xlarge, including one metal size, with up to 1.5 TiB of memory and 45 TB of local instance storage. They deliver up to 100 Gbps of network bandwidth and 60 Gbps of dedicated bandwidth for Amazon Elastic Block Store (EBS). To learn more, visit EC2 I8g instances.
Announcing support for Chaos V-Ray in AWS Deadline Cloud
AWS Deadline Cloud now supports usage-based licensing (UBL) for Chaos V-Ray, so you can seamlessly leverage the cloud to render with flexible V-Ray licensing. Deadline Cloud is a fully managed service that simplifies render management for teams creating computer-generated graphics and visual effects for films, television, broadcasting, web content, and design.

With AWS Deadline Cloud, you can submit and run standalone V-Ray job bundles, as well as submit V-Ray jobs from within Autodesk Maya, without having to manage your own render farm infrastructure. You can now scale V-Ray rendering workloads effortlessly, eliminating bottlenecks caused by local resource limitations. UBL integration offers a pay-as-you-go licensing model, ideal for studios managing dynamic workloads. You can build pipelines for 3D graphics and visual effects without having to set up, configure, or manage the worker infrastructure yourself. Creative teams can get started with V-Ray usage-based licensing today in all AWS Regions where Deadline Cloud is available. For more information, visit the Deadline Cloud product page, and see the Deadline Cloud pricing page for UBL price details.
Amazon S3 Inventory ACL support is now available in the AWS GovCloud (US) Regions
Amazon S3 Inventory’s capability to include access control lists (ACLs) as object metadata in inventory reports is now available in the AWS GovCloud (US) Regions. This lets you easily review the ACLs on all of your objects, simplifying the review of access permissions. ACLs were the original way to manage object access when S3 launched in 2006. Now, when migrating to IAM-based bucket policies for access control, you can review all of the object ACLs in your buckets before enabling S3 Object Ownership.

S3 Inventory provides a complete list of objects in a bucket along with their corresponding metadata. The object ACL fields include details about the object owner and each grantee, along with the permissions granted. You can activate reporting on object ACLs by editing existing S3 Inventory configurations in the AWS Management Console or via the API. By enabling S3 Object Ownership, you can change how S3 performs access control for a bucket so that only IAM policies are used. S3 Object Ownership’s ‘Bucket owner enforced’ setting disables ACLs for your bucket and the objects in it, and updates every object so that each object is owned by the bucket owner. We recommend that you carefully review your use of ACLs with inventory reports, migrate to IAM-based bucket policies, and then disable ACLs with S3 Object Ownership. For more information, see Controlling ownership of objects and disabling ACLs for your bucket. Amazon S3 Inventory support for object ACLs is generally available at no additional charge in all AWS Commercial and AWS GovCloud (US) Regions where Amazon S3 Inventory is available. To learn more, visit Amazon S3 Inventory and Amazon S3 pricing.
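To illustrate the shape of such a configuration (bucket names and ARNs below are hypothetical), an S3 Inventory configuration that includes ACL details adds the relevant entries to its optional fields. This sketch only builds the configuration dict in the shape the S3 PutBucketInventoryConfiguration API expects:

```python
def inventory_config_with_acls(dest_bucket_arn: str,
                               config_id: str = "acl-audit") -> dict:
    """Sketch of an S3 Inventory configuration for a daily CSV report
    that includes object owner and object ACL fields. The destination
    bucket ARN and config id are placeholders."""
    return {
        "Id": config_id,
        "IsEnabled": True,
        "IncludedObjectVersions": "Current",
        "Schedule": {"Frequency": "Daily"},
        "Destination": {
            "S3BucketDestination": {
                "Bucket": dest_bucket_arn,
                "Format": "CSV",
            }
        },
        # Optional fields adding ACL grants and owner info per object.
        "OptionalFields": ["ObjectOwner", "ObjectAccessControlList"],
    }

cfg = inventory_config_with_acls("arn:aws:s3:::example-inventory-dest")

# One could then apply it to a source bucket with boto3, e.g.:
# import boto3
# boto3.client("s3").put_bucket_inventory_configuration(
#     Bucket="example-source-bucket", Id=cfg["Id"],
#     InventoryConfiguration=cfg)
```

The resulting daily report would then carry an ACL column per object, which is what makes the pre-migration review described above practical at scale.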
Amazon EC2 M6id instances are now available in Europe (Spain) region
Starting today, Amazon Elastic Compute Cloud (Amazon EC2) M6id instances are available in the Europe (Spain) Region. These instances are powered by 3rd-generation Intel Xeon Scalable (Ice Lake) processors with an all-core turbo frequency of 3.5 GHz and up to 7.6 TB of local NVMe-based SSD block-level storage.

M6id instances are built on the AWS Nitro System, a combination of dedicated hardware and a lightweight hypervisor that delivers practically all of the compute and memory resources of the host hardware to your instances, for better overall performance and security. Customers can take advantage of high-speed, low-latency local storage to scale the performance of applications such as data logging, distributed web-scale in-memory caches, in-memory databases, and real-time big data analytics. Customers can purchase the new instances via Savings Plans, Reserved Instances, On-Demand, and Spot Instances. To get started, use the AWS Command Line Interface (CLI) or the AWS SDKs. To learn more, visit the M6id product page.
Amazon RDS Custom for SQL Server now supports change data capture
Amazon RDS Custom for SQL Server now supports Change Data Capture (CDC). In Microsoft SQL Server, CDC is a feature that uses the SQL Server Agent to log insertions, updates, and deletions occurring in a database table, and makes these data changes accessible to applications in a relational format to provide historical context.

Amazon RDS Custom for SQL Server is a managed database service that allows customers to bring their own license (BYOL) for Microsoft SQL Server. When you configure CDC in Amazon RDS Custom for SQL Server, RDS automatically reapplies your specified configurations when the underlying hardware is replaced or when you scale up to a larger database instance. RDS Custom for SQL Server thereby removes the operational overhead of setting up and verifying configurations across these operational situations. CDC is available in all AWS Regions where RDS Custom for SQL Server is available.
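For orientation, enabling CDC in SQL Server itself uses the standard stored procedures `sys.sp_cdc_enable_db` and `sys.sp_cdc_enable_table`; on RDS Custom you would run them against your DB instance with a client such as pyodbc. The schema and table names below are placeholders; this helper only builds the T-SQL strings:

```python
def cdc_enable_statements(schema: str, table: str, role: str = "") -> list:
    """Build the standard SQL Server commands to enable CDC, first at
    the database level and then for one table. Schema/table names are
    placeholders; pass role="" to use no gating role (NULL)."""
    role_sql = f"N'{role}'" if role else "NULL"
    return [
        # Enable CDC for the current database (run once per database).
        "EXEC sys.sp_cdc_enable_db;",
        # Enable CDC for a specific source table.
        (
            "EXEC sys.sp_cdc_enable_table "
            f"@source_schema = N'{schema}', "
            f"@source_name = N'{table}', "
            f"@role_name = {role_sql};"
        ),
    ]

# Hypothetical table: dbo.orders, with no gating role.
for stmt in cdc_enable_statements("dbo", "orders"):
    print(stmt)
```

Because RDS Custom reapplies your CDC configuration after host replacement or instance scaling, these statements generally need to be run only once per table rather than after every operational event.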
AWS Blogs
AWS Japan Blog (Japanese)
- Code with Kiro Hackathon announcements
- Generative AI is spreading even among small and medium-sized enterprises, contributing to corporate growth
- Resilience at AWS as seen at AWS Summit Japan 2025
- Accelerate SAP application development with Amazon Q Developer
- Introducing Kiro — the new Agentic IDE that works with you from prototype to production
- Prioritize security risks using exposure detection results on the AWS Security Hub
AWS News Blog
- Streamline the path from data to insights with new Amazon SageMaker Catalog capabilities
- AWS Free Tier update: New customers can get started and explore AWS with up to $200 in credits
- Monitor and debug event-driven applications with new Amazon EventBridge logging
- Introducing Amazon S3 Vectors: First cloud storage with native vector support at scale (preview)
- Amazon S3 Metadata now supports metadata for all your S3 objects
- TwelveLabs video understanding models are now available in Amazon Bedrock
AWS Open Source Blog
AWS Big Data Blog
- Introducing Jobs in Amazon SageMaker
- Orchestrate data processing jobs, querybooks, and notebooks using visual workflow experience in Amazon SageMaker
- Revenue NSW modernises analytics with AWS, enabling unified and scalable data management, processing, and access
AWS Database Blog
AWS DevOps & Developer Productivity Blog
AWS HPC Blog
AWS for Industries
Artificial Intelligence
- Amazon Bedrock Knowledge Bases now supports Amazon OpenSearch Service Managed Cluster as vector store
- Monitor agents built on Amazon Bedrock with Datadog LLM Observability
- How PayU built a secure enterprise AI assistant using Amazon Bedrock
- Supercharge generative AI workflows with NVIDIA DGX Cloud on AWS and Amazon Bedrock Custom Model Import
- Accelerate generative AI inference with NVIDIA Dynamo and Amazon EKS
- AWS doubles investment in AWS Generative AI Innovation Center, marking two years of customer success