Introducing Falcon 2 11B: The Next Generation Model from TII Available on Amazon SageMaker JumpStart
With the release of the Falcon 2 11B model from the Technology Innovation Institute (TII), AI practitioners now have access to an 11-billion-parameter large language model (LLM) trained on a 5.5-trillion-token dataset. The model supports multiple languages and handles tasks ranging from text and code generation to question answering, making it a valuable tool for a variety of AI workloads.
The Falcon 2 11B model is now available on Amazon SageMaker JumpStart, a machine learning hub that provides easy access to pre-built models and solutions. In this blog post, we’ll take a closer look at the Falcon 2 11B model, how to deploy it using SageMaker JumpStart, and some example use cases.
### Introducing Falcon 2 11B
The Falcon 2 11B model is the first in TII’s new Falcon 2 series. It is a dense decoder-only model trained on a 5.5-trillion-token dataset, making it well suited for auto-regressive text generation. With multilingual capabilities, Falcon 2 11B can handle tasks in languages such as English, French, Spanish, German, and more.
Served through the Hugging Face Text Generation Inference (TGI) Deep Learning Container on SageMaker, Falcon 2 11B is designed for high-performance text generation using tensor parallelism and dynamic batching. The model is available under the permissive Apache 2.0-based TII Falcon License 2.0, promoting responsible use of AI technologies.
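As a rough sketch of what that looks like in practice, the snippet below deploys the model through the TGI container with the SageMaker Python SDK. The Hugging Face model ID, instance type, and environment values shown are illustrative assumptions rather than official settings.

```python
# Illustrative sketch: deploying Falcon 2 11B through the Hugging Face TGI
# container with the SageMaker Python SDK. The model ID, instance type, and
# environment values below are assumptions, not official settings.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()                     # IAM role for SageMaker
image_uri = get_huggingface_llm_image_uri("huggingface")  # TGI serving image

model = HuggingFaceModel(
    image_uri=image_uri,
    role=role,
    env={
        "HF_MODEL_ID": "tiiuae/falcon-11B",   # assumed Hugging Face Hub ID
        "SM_NUM_GPUS": "4",                   # tensor parallelism across 4 GPUs
        "MAX_INPUT_LENGTH": "4096",           # assumed request limits
        "MAX_TOTAL_TOKENS": "8192",
    },
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",           # a 4-GPU g5 instance, for example
)
```

The `SM_NUM_GPUS` setting is what spreads the model across the instance's GPUs (tensor parallelism), while request batching is handled automatically by the TGI server.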
### Using SageMaker JumpStart
SageMaker JumpStart offers a convenient way to discover, deploy, and run inference on the Falcon 2 11B model. Through the SageMaker Studio UI or the SageMaker Python SDK, users can access and deploy the model to a SageMaker endpoint with just a few clicks or a few lines of code, allowing for quick and efficient inference.
From the SageMaker JumpStart landing page, users can choose the Falcon 2 11B model card, view model details, and deploy the model. The model is available for inference in the 22 AWS Regions where SageMaker JumpStart is available and requires GPU instances from the ml.g5 or ml.p4 families for deployment.
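The following is a minimal sketch of the SDK path, assuming the JumpStart model ID `huggingface-llm-falcon2-11b` and an ml.g5.12xlarge instance; check the model card in SageMaker Studio for the exact ID and supported instance types.

```python
# Minimal sketch: deploying Falcon 2 11B through the SageMaker JumpStart SDK.
# The model ID and instance type are assumptions; confirm them on the model card.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="huggingface-llm-falcon2-11b")
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",  # one of the supported g5/p4 instance types
)
print(predictor.endpoint_name)       # endpoint used for the example prompts below
```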
### Example Use Cases
To showcase the capabilities of the Falcon 2 11B model, we provide example prompts and sample outputs for several scenarios, with an invocation sketch after the list below:
1. **Text generation:** Generating text based on prompts like building a website in simple steps.
2. **Code generation:** Generating Python code for writing a JSON file.
3. **Sentiment analysis:** Analyzing sentiment from input text like tweets.
4. **Question answering:** Answering questions based on input prompts.
5. **Multilingual capabilities:** Interacting with the model in multiple languages.
6. **Mathematics and reasoning:** Solving math problems and explaining the solution.
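The snippet below is a hedged sketch of how a few of these prompts might be sent to the deployed endpoint with boto3. The endpoint name is a placeholder, and the payload format follows the TGI convention of an `inputs` string plus a `parameters` dictionary.

```python
# Hedged sketch: querying an already-deployed Falcon 2 11B endpoint for a few
# of the use cases listed above. The endpoint name is a placeholder.
import json
import boto3

runtime = boto3.client("sagemaker-runtime")
ENDPOINT = "falcon2-11b-endpoint"  # placeholder for your endpoint name

def generate(prompt, **params):
    """Send a prompt to the endpoint and return the parsed JSON response."""
    body = {"inputs": prompt, "parameters": {"max_new_tokens": 256, **params}}
    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT,
        ContentType="application/json",
        Body=json.dumps(body),
    )
    return json.loads(response["Body"].read())

# 1. Text generation
print(generate("Building a website can be done in 10 simple steps:"))

# 2. Code generation
print(generate("Write a Python function that writes a dictionary to a JSON file.",
               temperature=0.2))

# 3. Sentiment analysis (few-shot prompt)
sentiment_prompt = (
    "Tweet: I loved the new movie!\nSentiment: Positive\n"
    "Tweet: The service was terribly slow.\nSentiment: Negative\n"
    "Tweet: This update is fantastic.\nSentiment:"
)
print(generate(sentiment_prompt, max_new_tokens=5))
```

The question answering, multilingual, and mathematics prompts follow the same pattern; only the prompt text and generation parameters (such as `max_new_tokens` and `temperature`) change.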
### Conclusion
The Falcon 2 11B model offers a range of capabilities for AI practitioners, and deploying it through SageMaker JumpStart makes the process easy and efficient. With the ability to handle various tasks across different languages, the model can be a valuable tool for a wide range of applications.
To get started with Falcon 2 11B and explore its full potential, visit SageMaker JumpStart in SageMaker Studio. For more information, refer to the resources mentioned in this blog post.