
The joy of not managing your own AI infrastructure in today’s world

Anyone paying attention to the current state of AI is aware of the exploding demand for compute. Many of today’s engineers have discovered the wonderful land of not managing their own infrastructure: a growing number of companies now let you run models and other compute-intensive workloads on their managed clouds, offering “serverless” GPUs and other high-performance compute options.
Read more

Open-sourcing the Scale chatbot

Applications that provide a natural language interface between people and data will be at the forefront of enterprise LLM adoption. At Scale, we launched a chatbot to increase access to information about our firm, and we built it ourselves in part to better understand the solution space. We've just published a blog post exploring the architecture and sharing what we learned from the experience. We're also making the full source code available and free to use.
Read more