OSI is pleased to see that Meta is lowering barriers for access to powerful AI systems. Unfortunately, the tech giant has created the misunderstanding that LLaMa 2 is “open source” – it is not. Even assuming the term can be validly applied to a large language model comprising several resources of different kinds, Meta is confusing “open source” with “resources available to some users under some conditions,” two very different things. We’ve asked them to correct their misstatement.
“Open Source” means software under a license with specific characteristics, defined by the Open Source Definition (OSD). Among other requirements, an Open Source license may not discriminate against persons or groups, or against fields of endeavor (OSD points 5 and 6). Meta’s license for the LLaMa models and code does not meet this standard: it restricts commercial use for some users (paragraph 2) and restricts use of the model and software for certain purposes (the Acceptable Use Policy).
Why Open Source matters
An Open Source license ensures that developers and users are able to decide for themselves how and where to use the technology without the need to engage with another party; they have sovereignty over the technology they use. Open Source is premised on the understanding that everyone gets to share, no matter who they are. The commercial limitation in paragraph 2 of the LLAMA COMMUNITY LICENSE AGREEMENT is contrary to that promise in the OSD.
OSI does not question Meta’s desire to limit the use of Llama for competitive purposes, but doing so takes the license out of the category of “Open Source.”
The OSD does not allow restrictions on field of use because no one can know in advance how a technology will be used, for good or for bad. That freedom is what allowed the Linux kernel to become popular in medical devices as well as in airplanes and rockets.
But the Meta policy prohibits use in several areas that might be highly beneficial to society, such as applications involving regulated or controlled substances and use in critical infrastructure. Even something that sounds as simple as “you must follow the law” is problematic in practice. What if the law in different places is inconsistent? What if the law is unjust?
Avoiding adding more confusion
The license for the Llama LLM is very plainly not an “Open Source” license. Meta is making some aspects of its large language model available to some users, but not to everyone, and not for any purpose. OSI realizes how important it is to come to a shared understanding of what open means for AI systems. These are new human artifacts, much like software was a new creation of human intellect in the 1970s. We’re running a series of events to craft a common definition of “open” in the AI context, and we welcome submissions of ideas.