The launch is based on “Amazon Nova,” the company’s new generation of foundation models, the company said in a blog post. Foundation models (FMs) let data scientists build machine learning (ML) applications far more quickly than starting from scratch; because the models are trained on a broad spectrum of generalized data, they can underpin artificial intelligence applications capable of performing a wide variety of general tasks, Amazon says.
The new models are integrated with Amazon Bedrock, a managed service that makes FMs from AI companies and Amazon available for use through a single API. Using Amazon Bedrock, customers can experiment with and evaluate Amazon Nova models, as well as other FMs, to determine the best model for an application.
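To make the “single API” idea concrete, the sketch below shows one way a customer might call a Bedrock-hosted model from Python through the AWS SDK’s Converse API. It is an illustration only, not taken from Amazon’s announcement: the model ID, region, prompt, and inference settings are assumptions, and the model IDs actually available vary by account and region.

```python
# Minimal sketch (illustrative, not from Amazon's announcement): calling a
# Bedrock-hosted model through the Converse API with boto3. Assumes AWS
# credentials with Bedrock access are configured; model ID and region are assumed.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="amazon.nova-lite-v1:0",  # illustrative model ID; check what is enabled in your account
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize this shipment status update in two sentences."}],
    }],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The Converse API returns the model's reply as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```

Because Bedrock puts many providers’ models behind the same call, comparing candidate models during evaluation is largely a matter of changing the modelId argument.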
Calling the launch “the next step in our AI journey,” the company says Amazon Nova can process text, images, and video as prompts, so customers can use Amazon Nova-powered generative AI applications to understand videos, charts, and documents, or to generate videos and other multimedia content.
“Inside Amazon, we have about 1,000 Gen AI applications in motion, and we’ve had a bird’s-eye view of what application builders are still grappling with,” Rohit Prasad, SVP of Amazon Artificial General Intelligence, said in a release. “Our new Amazon Nova models are intended to help with these challenges for internal and external builders, and provide compelling intelligence and content generation while also delivering meaningful progress on latency, cost-effectiveness, customization, information grounding, and agentic capabilities.”
The new Amazon Nova models available in Amazon Bedrock include:
Amazon Nova Micro, a text-only model that delivers the lowest latency responses at very low cost.
Amazon Nova Lite, a very low-cost multimodal model that is lightning fast for processing image, video, and text inputs.
Amazon Nova Pro, a highly capable multimodal model with the best combination of accuracy, speed, and cost for a wide range of tasks.
Amazon Nova Premier, the most capable of Amazon’s multimodal models for complex reasoning tasks and for use as the best teacher for distilling custom models.
Amazon Nova Canvas, a state-of-the-art image generation model.
Amazon Nova Reel, a state-of-the-art video generation model that can transform a single image input into a brief video based on a natural-language prompt such as “dolly forward.”
Many AI deployments are getting stuck in the planning stages due to a lack of AI skills, governance issues, and insufficient resources, leading 61% of global businesses to scale back their AI investments, according to a study from the analytics and AI provider Qlik.
Philadelphia-based Qlik found a disconnect in the market: 88% of senior decision makers say AI is absolutely essential or very important to achieving success. Despite that support, multiple factors are slowing down or totally blocking those AI projects: a lack of skills to develop AI [23%] or to roll out AI once it’s developed [22%], data governance challenges [23%], budget constraints [21%], and a lack of trusted data for AI to work with [21%].
The numbers come from a survey of 4,200 C-suite executives and AI decision makers that examined what is hindering AI progress globally and how to overcome those barriers.
Respondents also said that many stakeholders lack trust in AI technology generally, which holds those projects back. Over a third [37%] of AI decision makers say their senior managers lack trust in AI, 42% feel less senior employees don’t trust the technology, and a fifth [21%] believe their customers don’t trust AI either.
“Business leaders know the value of AI, but they face a multitude of barriers that prevent them from moving from proof of concept to value creating deployment of the technology,” James Fisher, Chief Strategy Officer at Qlik, said in a release. “The first step to creating an AI strategy is to identify a clear use case, with defined goals and measures of success, and use this to identify the skills, resources and data needed to support it at scale. In doing so you start to build trust and win management buy-in to help you succeed.”
The “D&B Ask Procurement” product works by synthesizing vast datasets and providing intelligent recommendations, according to Dun & Bradstreet, which calls itself a provider of business decisioning data and analytics.
It was built with IBM’s watsonx Orchestrate and watsonx.ai technology with support from IBM Consulting, and connects to Dun & Bradstreet’s business risk, financial, and firmographic data and insights. It then uses a conversational chat interface to provide advanced reasoning capabilities and autonomous decision making, helping teams to query critical supplier insights, expedite analysis and reporting, and identify suppliers for engagement, the partners said.
“One key point of entry for Gen AI adoption is AI assistants, and together IBM and Dun & Bradstreet are collaborating to bring clients new innovations within the procurement domain,” Parul Mishra, Vice President of Product Management, Digital Labor at IBM, said in a release. “With D&B Ask Procurement, an AI assistant built on the foundation of watsonx Orchestrate, users can seamlessly complete tasks and automate complex processes with natural language, helping drive efficiency, cost-savings and higher productivity.”
The San Francisco tech startup Vooma has raised $16 million in venture funding for its artificial intelligence (AI) platform designed for freight brokers and carriers, the company said today.
The backing comes from a $13 million Series A round led by Craft Ventures, which followed an earlier $3.6 million seed round led by Index Ventures with participation from angel investors, including founders and executives from major logistics and technology companies such as Motive, Project44, Ryder, and Uber Freight.
Founded in 2023, the firm has built “Vooma Agents,” which it calls a multi-channel AI platform for logistics. The system uses various agents operating across email, text, and voice channels, enabling automation of workflows that existing systems previously could not address. According to Vooma, its platform lets logistics companies scale up their operations by reducing time spent on tedious and manual work, creating space to solve real logistical challenges while also investing in critical relationships.
The company’s solutions include: Vooma Quote, which identifies quotes and drafts email responses; Vooma Build, a data-entry assistant for load building; and Vooma Voice, which can make and receive calls for brokers and carriers. Additional options are Vooma Insights and the future releases of Vooma Agent and Vooma Schedule.
“The United States moves approximately 11.5 billion tons of truckloads annually, and moving freight from point A to B requires hundreds of touchpoints between shippers, brokers and carriers,” a Vooma co-founder, who is the former CEO of ASG LogisTech, said in a release. “By introducing AI that fits naturally into existing systems, workflows and communication channels used across the industry, we are meaningfully reducing the tasks people dislike and freeing up their time and headspace for more meaningful and complex challenges.”
The new funding brings Amazon's total investment in Anthropic to $8 billion, while maintaining the e-commerce giant’s position as a minority investor, according to Anthropic. The partnership launched in 2023, when Amazon made its initial $4 billion investment in the firm.
Anthropic’s “Claude” family of AI assistant models is available on AWS’s Amazon Bedrock, which is a cloud-based managed service that lets companies build specialized generative AI applications by choosing from an array of foundation models (FMs) developed by AI providers like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon itself.
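As a rough sketch of what that looks like in practice, an application can reach a Claude model through the same Bedrock Converse call used for any other provider’s model; the model ID and prompt below are assumptions for illustration, not details from the announcement.

```python
# Illustrative sketch only: invoking an Anthropic Claude model hosted on Amazon
# Bedrock via boto3's Converse API. The model ID is assumed; verify which Claude
# model IDs are enabled in your account and region.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # assumed/illustrative Claude model ID
    messages=[{
        "role": "user",
        "content": [{"text": "Draft a short, friendly reply to a customer asking about a delayed order."}],
    }],
    inferenceConfig={"maxTokens": 300},
)

print(response["output"]["message"]["content"][0]["text"])
```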
According to Amazon, tens of thousands of customers, from startups to enterprises and government institutions, are currently running their generative AI workloads using Anthropic’s models in the AWS cloud. Those GenAI tools are powering tasks such as customer service chatbots, coding assistants, translation applications, drug discovery, engineering design, and complex business processes.
"The response from AWS customers who are developing generative AI applications powered by Anthropic in Amazon Bedrock has been remarkable," Matt Garman, AWS CEO, said in a release. "By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies. We’ve been impressed by Anthropic’s pace of innovation and commitment to responsible development of generative AI, and look forward to deepening our collaboration."
A growing number of organizations are identifying ways to use GenAI to streamline their operations and accelerate innovation, applying that new automation and efficiency to cut costs, carry out tasks faster and more accurately, and create new products and services that open additional revenue streams. That was the conclusion of ISG’s “2024 ISG Provider Lens Global Generative AI Services” report.
The most rapid development of enterprise GenAI projects today is happening on text-based applications, primarily due to relatively simple interfaces, rapid ROI, and broad usefulness. Companies have been especially aggressive in implementing chatbots powered by large language models (LLMs), which can provide personalized assistance, customer support, and automated communication on a massive scale, ISG said.
However, most organizations have yet to tap GenAI’s potential for applications based on images, audio, video and data, the report says. Multimodal GenAI is still evolving toward mainstream adoption, but use cases are rapidly emerging, and with ongoing advances in neural networks and deep learning, they are expected to become highly integrated and sophisticated soon.
Future GenAI projects will also be more customized, as the sector sees a major shift from fine-tuning of LLMs to smaller models that serve specific industries, such as healthcare, finance, and manufacturing, ISG says. Enterprises and service providers increasingly recognize that customized, domain-specific AI models offer significant advantages in terms of cost, scalability, and performance. Customized GenAI can also deliver on demands like the need for privacy and security, specialization of tasks, and integration of AI into existing operations.