In a surprising shift from its usual principles, Meta has decided to offer its AI model Llama to the US government to support national security. It will be used to simplify and streamline logistics, track terrorist funding, and bolster cyber defenses.
Several other tech giants, such as Amazon, IBM, Oracle, and Microsoft, have partnered with Meta to offer a full-scale service to the government.
On Monday, Meta announced that it will now allow the government and agencies working on national security to use its AI model for military purposes.
The announcement was first made by Mark Zuckerberg during last week’s earnings call and was later detailed in a blog post by Nick Clegg, Meta’s president of global affairs.
The decision is a surprising shift from Meta’s usual stance on the matter; previously, the company refrained from mixing its technology with military affairs.
Now, however, it has changed course, saying it wants to enable the “responsible and ethical uses” of its AI model that support the United States and “democratic values” in the cut-throat global competition for AI dominance.
About the Service
As for the service itself, for now only Meta’s Llama model will be used for military purposes. It will help agencies simplify complicated logistics, strengthen cyber defenses, and track terrorist funding.
The company has also partnered with other tech firms, including Oracle, IBM, Amazon, Microsoft, and Accenture, to provide a full-scale service to the government.
For example, Oracle will use Llama to quickly process aircraft maintenance documents, while Amazon and Microsoft will use it to host and process other national security data.
Meta has also joined hands with defense contractors such as Lockheed Martin and Booz Allen, and with defense-tech companies such as Palantir and Anduril.
What’s In It For Meta?
Meta isn’t charging for this service, but it certainly has incentives; after all, a company like Meta doesn’t do anything out of charity. In the blog post, Meta says it simply wants to do its part in fostering national security, but there are clearly some less obvious perks.
The first incentive is reputation. Partnering with government agencies makes Meta come across as reliable and might encourage others to adopt Meta AI.
The second is that Chinese researchers have already used Llama for military purposes. Given the strained relationship between China and the US, the company probably didn’t want to seem like it was supporting an adversary.
To prevent this from turning into a US-China standoff, Meta has also decided to open up Llama access to the other members of the Five Eyes intelligence alliance: Canada, Australia, New Zealand, and the UK.
While on the surface this sounds like a winning plan, it’s likely to attract criticism. The first objection may come from Meta’s own workforce: many US citizens disapprove of the country’s stance on the Israel-Palestine war.
At a time like this, a tech giant like Meta investing in strengthening the military may be looked down upon – we have already seen employee protests against Microsoft and Google.
The second issue is that Llama is an open-source AI model. Meta believes keeping technology open source is the key to innovation, but companies like OpenAI and Google have warned that AI is far too powerful to be made open source.
Whatever the incentives and criticisms may be, it appears to be a welcome move toward strengthening the country’s overall military infrastructure.
The post Meta’s AI Model Llama to Be Used for Military Purposes appeared first on Techreport.
By Krishi Chowdhary