AI-driven technology continues to advance at a rapid pace, and the language model sector is no exception. StabilityAI, the maker of the image-generation tool Stable Diffusion, has launched a suite of open-source language models called StableLM. The suite currently contains models with three billion and seven billion parameters, with 15-billion, 30-billion and 65-billion parameter models described as in progress and a 175-billion parameter model planned for the future.
By comparison, OpenAI's GPT-4 has an estimated parameter count of one trillion, roughly six times the 175 billion parameters of GPT-3.
It is unclear at this time how the StableLM models perform. The GitHub page states that information on the models' capabilities, training settings and specifications will be made available soon.
The launch of an open-source alternative to OpenAI's ChatGPT could be an interesting development for cryptocurrency traders, given its potential to cut the cost of building trading bots. Third-party tools such as BabyAGI and AutoGPT, for example, are built on top of the GPT API to create advanced trading bots. Increased competition from open-source machine learning models could be a great benefit for traders who cannot afford OpenAI's premiums.
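To make the cost argument concrete, here is a minimal back-of-the-envelope sketch. All figures are hypothetical and for illustration only: the per-token API price, the token volume and the GPU-rental fee are assumptions, not actual OpenAI or hosting rates.

```python
# Illustrative only: hypothetical prices, not actual OpenAI or hosting rates.

def monthly_cost(tokens_per_month: int, price_per_1k_tokens: float) -> float:
    """Estimate a bot's monthly spend given a per-1K-token price."""
    return tokens_per_month / 1000 * price_per_1k_tokens

# Assumption: a bot making frequent market-summary calls might consume
# ~50 million tokens per month.
TOKENS = 50_000_000

# Hypothetical figures: a paid API at $0.03 per 1K tokens versus a flat
# monthly GPU-rental bill for self-hosting an open-source model like StableLM.
api_cost = monthly_cost(TOKENS, 0.03)
self_hosted_flat_fee = 600.0  # assumed GPU rental cost

print(f"API: ${api_cost:,.2f}/mo vs self-hosted: ${self_hosted_flat_fee:,.2f}/mo")
```

Under these assumed numbers the metered API bill scales with usage while the self-hosted cost is flat, which is the trade-off driving interest in open-source models.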
StabilityAI is led by founder and CEO Emad Mostaque, who firmly believes in the power of open-source platforms and collaboration within the AI industry. Through initiatives like StableLM, he and the rest of the StabilityAI team are working to build a stronger global AI community. The 7B-parameter model is currently available to view and interact with through a HuggingFace live interface.
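For readers who want to go beyond the hosted demo, the tuned StableLM checkpoints expect chat prompts wrapped in special role tokens, per the StableLM GitHub README. The sketch below only builds that prompt string; the helper name and example messages are ours.

```python
# A minimal sketch of the chat-prompt format used by the tuned StableLM
# checkpoints (per the StableLM GitHub README). The helper function name
# and the example messages are illustrative, not part of the release.

def build_prompt(user_message: str, system_message: str = "") -> str:
    """Wrap a user message in StableLM-Tuned's special role tokens."""
    system = f"<|SYSTEM|>{system_message}" if system_message else ""
    return f"{system}<|USER|>{user_message}<|ASSISTANT|>"

prompt = build_prompt(
    "Summarize today's BTC price action in one sentence.",
    "You are a concise, helpful assistant.",
)
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation; the `<|ASSISTANT|>` token at the end cues the model to produce its reply.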