February 7, 2025 – In a recent conference call with analysts, Amazon CEO Andy Jassy shared his predictions for the future of artificial intelligence, pointing specifically to a significant drop in the cost of AI inference. That reduction, he believes, will make it far easier for businesses to build reasoning and generative AI into more of their applications.
When asked whether DeepSeek's recent technical breakthroughs could bring down AI costs, Jassy echoed other technology industry leaders: in his view, such advances are likely to increase overall demand for artificial intelligence rather than simply shrink spending on it.
“The news you’ve heard about DeepSeek in the past few weeks is part of this larger trend,” he added. Impressed by DeepSeek’s AI models, Amazon has moved quickly to make them available on its Bedrock and SageMaker platforms. Jassy emphasized that Amazon is committed to “providing customers with as many cutting-edge models as possible for their selection.”

Furthermore, Jassy clarified that a decrease in the cost of technologies like AI inference doesn’t necessarily mean businesses will curtail their tech investments. On the contrary, he argued that cost reductions enable companies to pursue innovative projects that were previously shelved due to budgetary constraints, potentially increasing overall technology expenditure.
Previously, it was reported that as of January 30th, users can deploy the DeepSeek-R1 model through Amazon Bedrock and Amazon SageMaker AI. Amazon Bedrock suits teams that want to quickly integrate pre-trained models via APIs, while Amazon SageMaker AI serves teams that need advanced customization, training, deployment, and access to the underlying infrastructure.
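For the Bedrock route, a minimal sketch of what an API call might look like is shown below, using the AWS SDK for Python (boto3) and its Converse API. The model identifier, region, and prompt are placeholders, not confirmed by the report; the exact model ID depends on how DeepSeek-R1 is listed in your Bedrock account.

```python
# Minimal sketch (assumptions noted): invoking a DeepSeek-R1 model hosted on
# Amazon Bedrock via the Converse API. The model ID and region are placeholders;
# substitute the identifier shown for DeepSeek-R1 in your Bedrock console.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

MODEL_ID = "us.deepseek.r1-v1:0"  # placeholder; check your Bedrock model list

response = bedrock.converse(
    modelId=MODEL_ID,
    messages=[
        {
            "role": "user",
            "content": [{"text": "Explain chain-of-thought reasoning in two sentences."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.6},
)

# The Converse API returns the assistant reply under output.message.content
print(response["output"]["message"]["content"][0]["text"])
```

Because Bedrock exposes models behind a single managed API, swapping in a different foundation model is largely a matter of changing the model ID rather than re-provisioning infrastructure.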
Additionally, users can deploy the DeepSeek-R1-Distill models more cost-effectively using Amazon EC2, Amazon SageMaker AI, AWS Trainium, and AWS Inferentia; a sketch of one such deployment follows.
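As an illustration of the SageMaker AI route, the sketch below deploys one of the openly released distilled checkpoints from the Hugging Face Hub to a GPU-backed real-time endpoint using the SageMaker Python SDK's Hugging Face LLM container. The instance type, environment settings, and IAM role are assumptions for illustration only; Trainium or Inferentia deployments would instead use Neuron-based containers on trn1/inf2 instances.

```python
# Minimal sketch (assumptions noted): hosting a distilled DeepSeek-R1 checkpoint
# on a SageMaker real-time endpoint with the Hugging Face LLM (TGI) container.
# Instance type, environment settings, and role are assumptions; adjust to your
# account, region, and the checkpoint size you choose.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()  # or an explicit IAM role ARN

# Latest Hugging Face LLM serving image available in your region
llm_image = get_huggingface_llm_image_uri("huggingface")

model = HuggingFaceModel(
    role=role,
    image_uri=llm_image,
    env={
        "HF_MODEL_ID": "deepseek-ai/DeepSeek-R1-Distill-Llama-8B",  # distilled checkpoint
        "SM_NUM_GPUS": "1",            # GPUs available on the chosen instance
        "MAX_INPUT_LENGTH": "4096",
        "MAX_TOTAL_TOKENS": "8192",
    },
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",     # assumed GPU instance; size to the model
    container_startup_health_check_timeout=600,
)

print(predictor.predict({
    "inputs": "Summarize why lower inference costs can increase AI demand.",
    "parameters": {"max_new_tokens": 256, "temperature": 0.6},
}))
```

Running a smaller distilled checkpoint on a modest instance like this is one way the cost savings Jassy describes show up in practice: the same serving pattern applies, but the per-hour infrastructure bill is a fraction of what the full-size model would require.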