Google and Project Maven
VentureBeat led its AI newsletter today (June 1, 2018) with a discussion about Google’s involvement in the Pentagon’s Project Maven.
Perhaps because of Google’s legacy of public mottos and codes of conduct stating that it doesn’t want to be evil (language it recently removed), there is more backlash against Google’s participation in Project Maven than against the aerospace companies that routinely partner with the military.
Project Maven seeks to apply pattern recognition to identify and classify objects captured by military drones. The specifications align well with many of Google’s software development competencies.
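To make the identify-and-classify task concrete, here is a minimal sketch of that kind of pipeline using an off-the-shelf pretrained classifier. The model choice (ResNet-50 via torchvision), the ImageNet labels, and the frame filename are illustrative assumptions; nothing here reflects Maven’s actual systems, which would be trained on drone footage rather than consumer photo categories.

```python
# Minimal sketch: classify a single video frame with a pretrained model.
# The model, labels, and file path are placeholders, not anything Maven-specific.
import torch
from torchvision import models
from PIL import Image

# Load a classifier pretrained on ImageNet (not drone imagery).
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

# Use the preprocessing that matches the pretrained weights.
preprocess = weights.transforms()

image = Image.open("frame_0001.jpg")      # hypothetical video frame
batch = preprocess(image).unsqueeze(0)    # add a batch dimension

with torch.no_grad():
    logits = model(batch)
    probs = torch.softmax(logits, dim=1)

top_prob, top_idx = probs.max(dim=1)
label = weights.meta["categories"][top_idx.item()]
print(f"{label}: {top_prob.item():.2%}")
```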
AI’s military history
As VentureBeat’s Khari Johnson points out, there is a long association between the military and AI. The Defense Advanced Research Projects Agency (DARPA) funded much of the earliest AI work. At Hughes Aircraft, I personally worked on multiple DARPA-related projects, including writing a paper on adaptive manufacturing using agents. I also participated in oversight of General Motors’ investment in the Microelectronics and Computer Technology Corporation (MCC). MCC was a national response to the Japanese Fifth Generation Project, which in the 1980s was seen as a strategic threat to United States dominance in microelectronics and software. DARPA, along with several commercial firms, formed MCC to fund, among other software and hardware projects, Cyc, which at the time was the largest AI project ever conceived (the now-independent Cycorp can be found here).
Much of the AI in use today likely traces back to military funding of its basic concepts, if not of the actual software or algorithms themselves.
Why does Google need to partner with the military?
With the military robotics market predicted to reach $30.83 billion by 2022 (according to a Markets and Markets report), it would make sense for one of the largest software companies to want to tap into that financial stream. But that probably isn’t the core reason.
Google knows that military challenges will drive learning. Commercial applications benefit from the discovery of the underlying principles required to solve new problems. That reality forms the foundation of the intertwined history between commercial AI and its military antecedents.
We know Google already conducts vision and pattern recognition research. The company offers apps like Arts and Culture, which includes an AI-powered vision feature that matches a consumer’s face to a famous painting. Google has plenty of money to fund basic AI and vision research. Google likely perceives Project Maven as a source of requirements, use cases, and data to fuel its internal learning.
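As an illustration of how a feature like that can work, here is a minimal sketch of embedding-based matching: compare the embedding of a user’s photo against a gallery of painting embeddings and return the nearest match. Google has not published the feature’s internals, so the embedding model is abstracted away and the vectors below are random placeholders.

```python
# Minimal sketch of embedding-based image matching (nearest neighbor by cosine
# similarity). The embeddings are random placeholders standing in for the output
# of some unspecified face/vision model.
import numpy as np

def cosine_similarity(query: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Cosine similarity between a query vector and each row of a gallery matrix."""
    query = query / np.linalg.norm(query)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ query

rng = np.random.default_rng(0)
painting_embeddings = rng.normal(size=(1000, 512))   # hypothetical gallery of 1,000 paintings
painting_titles = [f"painting_{i}" for i in range(1000)]
selfie_embedding = rng.normal(size=512)              # hypothetical embedding of the user's photo

scores = cosine_similarity(selfie_embedding, painting_embeddings)
best = int(np.argmax(scores))
print(f"Closest match: {painting_titles[best]} (similarity {scores[best]:.3f})")
```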
Like it or not, the military offers learning opportunities that cannot be duplicated outside the laboratory of conflict.
Google doesn’t need a policy, it needs a strategy
Also according to VentureBeat, Google is drafting a military project policy (see Kyle Wiggers’ post here). Google does need to reconcile its commercial consumer relationships with any military work it takes on. But it doesn’t need a policy; it needs a strategy. And that strategy should create a new company focused on obtaining and managing military contracts.
Google has already taken over the airship hangars at Moffett Field, which would make ideal test sites for drone vision work. It should start with a small organization in the Moffett Field area, take over those assets, and build a management team that understands the military-aerospace business model.
The new division would focus on competing with the likes of Raytheon and Northrop Grumman (or facilitating coopetition). It would also insulate Google from the management requirements and processes of military contracting far more effectively than trying to fold those processes into its commercial operations.
Of course, Google employees should be especially appreciative of military investments in information technology. Without the commercialization of the ARPANET, Google would not exist.
For more on AI and the military, see the Serious Insights discussion on AI and ethics here.