NAPA, Calif. – At the Linux Foundation Member Summit, Linux Foundation Research announced a new report that dives deep into the sometimes awkward relationship between open source and artificial intelligence (AI).
Called “Shaping the Future of Generative AI,” the report — produced by Linux Foundation AI & Data and the Cloud Native Computing Foundation (CNCF) — reveals that open-source software is crucial in shaping the future of generative AI. That much we already knew! Indeed, AI can’t exist without open-source programs such as PyTorch and TensorFlow. What this report, based on a survey of 316 AI professionals, brings to the table is an analysis of significant new trends in open source and Gen AI.
Would you be surprised to know that 94% of organizations currently use Gen AI, with 42% reporting high or very high adoption rates?
On average, 41% of those using Gen AI report that their organization’s code infrastructure supporting AI is open source. That figure rises to 47% among high adopters of Gen AI. And it’s far higher still when you look behind the curtain at how machine learning builds large language models (LLMs) in the first place.
Why? Nearly half (46%) of organizations cited cost efficiency as their reason for choosing open-source Gen AI solutions. Open-source tools reduce both upfront licensing costs and long-term dependence on proprietary vendors. For example, AI application frameworks such as LangChain and LlamaIndex enable organizations to develop and deploy AI models at a fraction of the cost of proprietary solutions.
That can amount to some serious savings. BloombergGPT, Bloomberg’s 50-billion-parameter finance LLM, cost around $3 million to build.
Another source of cost savings is the rise of cloud-native technologies for running scalable Gen AI programs. Kubernetes, for instance, has emerged as a key enabler for orchestrating scalable Gen AI workloads, with 50% of organizations using it to host some or all of their Gen AI inferencing workloads.