A marketplace to help developers access open-source LLMs
Motley is a federated API that allows developers to access a wide range of large language models (LLMs). It also allows them to monetize their server infrastructure by hosting LLMs for others.
LLMs are constantly improving, and nowadays, it seems new state-of-the-art models are released weekly. Different models have different strengths, and it's becoming more and more common to use multiple models across a single application.
At the same time, the requirements for hosting LLMs are dropping: it's now possible to run a fairly good LLM on a MacBook Pro or an iPhone.
By using Motley, you always get access to the best model for the job and can monetize your hardware.
Motley's federated API acts as an API gateway, routing each request to one of many large language models hosted by third parties. With Motley, you can monetize your LLMs and access the best models for your needs, all through a robust, reliable API.
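In practice, a request through such a gateway would look like an OpenAI-style chat completion call. The endpoint URL, API key, model name, and response shape below are illustrative assumptions, not Motley's actual API:

```python
import json
import urllib.request

# Hypothetical values -- the real endpoint, key, and model names may differ.
MOTLEY_URL = "https://api.motley.example/v1/chat/completions"
API_KEY = "your-api-key"

def build_chat_request(prompt: str, model: str = "llama-7b") -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str, model: str = "llama-7b") -> str:
    """Send the request through the gateway and return the reply text."""
    req = urllib.request.Request(
        MOTLEY_URL,
        data=json.dumps(build_chat_request(prompt, model)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the gateway mimics the OpenAI wire format, existing OpenAI client code should need little more than a changed base URL.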
Follow these steps to get started:
Follow these steps to host your LLM:
How can I join the program?
Email benjamin.shafii@gmail.com or hi@louis030195.com for access to the program.
How does Motley ensure the quality and accuracy of the hosted language models?
Hosted LLMs are regularly polled by Motley's back-end to measure uptime, latency, quality, performance, and security, using industry-standard techniques to ensure a reliable, OpenAI-API-like experience for developers.
Furthermore, hosts are ranked based on these various metrics to assess the quality of their hosting. This incentivizes them to provide high quality and trusted services to developers. Developers can in turn choose from the top-ranked models that meet their budget and requirements.
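A minimal sketch of how such a ranking could work. The metric fields and the scoring formula are illustrative assumptions, not Motley's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class HostMetrics:
    """Hypothetical per-host metrics gathered by the polling back-end."""
    name: str
    uptime: float        # fraction of successful health checks, 0..1
    latency_ms: float    # median response latency
    price_per_1k: float  # price per 1k tokens, in USD

def score(h: HostMetrics) -> float:
    # Illustrative weighting: reward uptime, penalise latency.
    return h.uptime * 100 - h.latency_ms / 50

def top_hosts(hosts: list[HostMetrics], budget_per_1k: float) -> list[HostMetrics]:
    """Return hosts the developer can afford, best-scored first."""
    affordable = [h for h in hosts if h.price_per_1k <= budget_per_1k]
    return sorted(affordable, key=score, reverse=True)
```

A developer would then pick from the head of this list, trading off score against price.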
Motley allows developers to gather feedback from users who utilize their hosted models. This feedback loop helps developers identify areas for improvement and refine their models over time.
Are there any infrastructural requirements for hosting these models?
Historically, hosting LLMs has required considerable computational resources. With the emergence of techniques such as quantization that shrink large models with little loss of performance, LLMs can now run both on the GPU and at the edge, even on your phone.
Hosting language models still requires a stable and reliable network infrastructure, with sufficient bandwidth to handle incoming requests and serve responses in a timely manner.
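As a back-of-envelope check on that bandwidth requirement (the 4-bytes-per-token figure is a rough assumption for JSON-encoded English text, not a measured constant):

```python
def min_bandwidth_kbps(requests_per_s: float,
                       tokens_per_response: int,
                       bytes_per_token: int = 4) -> float:
    """Rough outbound bandwidth needed to serve responses, in kbit/s.

    Ignores JSON framing, TLS overhead, and streaming; treat it as a
    lower bound, not a capacity plan.
    """
    bits_per_s = requests_per_s * tokens_per_response * bytes_per_token * 8
    return bits_per_s / 1000

# e.g. 10 req/s of 500-token responses needs at least ~160 kbit/s outbound
```

Even modest home connections clear that bar; for most hosts the binding constraint is compute, not bandwidth.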
And how does Motley facilitate the monetization process for developers?
Motley provides a comprehensive dashboard that allows developers to track the usage of their hosted language models. This helps developers understand the popularity and consumption patterns of their models, enabling them to optimize their monetization strategies.
Motley integrates billing and payment systems, making it easy for developers to set pricing, manage subscriptions, and receive payments from users who access and utilize their hosted models.
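The metering-plus-billing loop described above can be sketched as follows; the class name and the per-1k-token pricing model are illustrative assumptions, not Motley's actual billing API:

```python
from collections import defaultdict

class UsageMeter:
    """Track token usage per API key and compute what each user owes."""

    def __init__(self, price_per_1k_tokens: float):
        self.price = price_per_1k_tokens
        self.tokens: dict[str, int] = defaultdict(int)

    def record(self, api_key: str, tokens: int) -> None:
        """Called after each completion served to this key."""
        self.tokens[api_key] += tokens

    def invoice(self, api_key: str) -> float:
        """Amount owed in USD, rounded to cents."""
        return round(self.tokens[api_key] / 1000 * self.price, 2)
```

The dashboard would aggregate these per-key counters over a billing period; the host only sets `price_per_1k_tokens` and receives payouts.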
Motley provides a platform for developers to showcase their hosted language models, making them discoverable to a wider audience. This increases the monetization potential by attracting more users to utilize the models.