Amazon Nova: new generation of foundation AI models

At the re:Invent conference, Amazon announced a set of generative AI models called Nova.

The new generation of foundation models, Amazon Nova, is available through Amazon Bedrock. The models are designed to reduce cost and latency in generative AI applications and are aimed specifically at enterprise environments.

Different models for various applications

Amazon Nova comprises two main categories: understanding models and creative content generation models. The understanding models process text, images or video. The creative content generation models produce images or video from text and sample images.

Three variants of the understanding models are currently available:

  • Amazon Nova Micro: a low-latency text-oriented model suitable for applications such as summarizing, translating and brainstorming.
  • Amazon Nova Lite: a low-cost multimodal model that processes text, images and videos.
  • Amazon Nova Pro: an advanced multimodal model that supports complex workflows and is suitable for a wide range of tasks.

A fourth model, Amazon Nova Premier, is expected in early 2025 and will support complex reasoning tasks.
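Because the models run on Amazon Bedrock, they can be called with the same runtime API as other Bedrock models. The snippet below is a minimal, unofficial sketch in Python using boto3's converse call; the model ID amazon.nova-lite-v1:0, the region and the prompt are assumptions and may differ per account and region.

    import boto3

    # Bedrock runtime client; the region is an assumption, availability varies per region.
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Ask Nova Lite (model ID assumed) to summarize a piece of text.
    response = client.converse(
        modelId="amazon.nova-lite-v1:0",
        messages=[
            {
                "role": "user",
                "content": [{"text": "Summarize this announcement in three bullet points: ..."}],
            }
        ],
        inferenceConfig={"maxTokens": 512, "temperature": 0.3},
    )

    # The generated text sits in the first content block of the returned message.
    print(response["output"]["message"]["content"][0]["text"])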

The creative models are:

  • Amazon Nova Canvas: focused on image generation with advanced editing capabilities.
  • Amazon Nova Reel: a video model that creates short videos from text and image input. For now, it generates clips of up to six seconds; support for videos of up to two minutes is planned.

Customization for specific sectors

Amazon Nova models offer extensive customization options: organizations can fine-tune the models to understand specific terminology, maintain brand identity and deliver optimal performance for their use cases. A law firm, for example, can customize a model to interpret legal terminology and documents.
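In practice, such customization would go through Bedrock's existing model-customization workflow. The sketch below is a hypothetical Python example of starting a fine-tuning job on a Nova base model; the base-model identifier, S3 locations, IAM role and hyperparameters are placeholders, and which Nova variants support fine-tuning should be verified against AWS documentation.

    import boto3

    # Bedrock control-plane client (distinct from the runtime client used for inference).
    bedrock = boto3.client("bedrock", region_name="us-east-1")

    # Start a fine-tuning job on a Nova base model with domain-specific training data.
    # All identifiers, S3 URIs, the role ARN and hyperparameters are placeholders.
    job = bedrock.create_model_customization_job(
        jobName="legal-terminology-tuning",
        customModelName="nova-lite-legal",
        roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
        baseModelIdentifier="amazon.nova-lite-v1:0",
        customizationType="FINE_TUNING",
        trainingDataConfig={"s3Uri": "s3://example-bucket/legal-training-data.jsonl"},
        outputDataConfig={"s3Uri": "s3://example-bucket/custom-model-output/"},
        hyperParameters={"epochCount": "2", "learningRate": "0.00001"},
    )

    print(job["jobArn"])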

With this launch, Amazon is responding to the growing demand for scalable and flexible AI solutions for businesses.
