<h2>A surprising survey</h2>
<p>The answer is yes, and within five to 10 years, according to 37% of respondents to a survey issued at the <a href="https://www.hlai-conf.org/" target="_blank">Joint Multi-Conference on Human-Level Artificial Intelligence</a> (HLAI) held last month in Prague.</p><p>The survey, which was conducted by the AI startup SingularityNET and the AI research and development company GoodAI, found that 28% of respondents expected AGI within the next two decades, while just 2% didn’t believe humans will ever develop it.</p><p>The survey also asked respondents to rate the sectors in which they thought AI could have the greatest impact. The results broke down like this:</p><ul><li>Healthcare (46%)</li><li>Logistics (41%)</li><li>Customer service (38%)</li><li>Banking and finance (34%)</li><li>Agriculture; retail; software development; manufacturing (28%)</li></ul><p>“It’s no secret that machines are <a href="https://www.futuretimeline.net/21stcentury/images/future-timeline-technology-singularity.jpg" target="_blank" class="hoverZoomLink">advancing exponentially</a> and will eventually surpass human intelligence,” said Ben Goertzel, SingularityNET’s CEO and creator of the software behind a social, humanoid robot named <a href="https://en.wikipedia.org/wiki/Sophia_(robot)" target="_blank">Sophia</a>. “But, as these survey results suggest, an increasing number of experts believe this ‘Singularity’ point may occur much sooner than is commonly thought. Artificial general intelligence at the human level or beyond, as many respondents to our poll noted, could very well become a reality within the next decade.”</p>
<p>A 2016 survey of AI researchers who had been published in top peer-reviewed journals found slightly less exciting results. The survey makers asked respondents to estimate how many years it would be before AI possessed “high-level machine intelligence,” which they defined as being “achieved when unaided machines can accomplish every task better and more cheaply than human workers.” The respondents were also asked about specific AI milestones, such as when AI would outperform humans in complex tasks like surgery.</p>
<p>Timelines showing 50% probability intervals for achieving selected AI milestones, based on survey respondent opinions. Specifically, intervals represent the date range from the 25% to 75% probability of the event occurring. Circles denote the 50%-probability year that AI will achieve or exceed human performance. (Grace et al., 2018.)</p>
<p>The survey paper concludes with researchers suggesting that, though there are many reasons to be optimistic about developments in AI, researchers in the field are sometimes no better at predicting the future than crude statistical representations.</p><p>Some experts who attended the recent HLAI conference voiced similar caution.</p><p>“At the moment, there is absolutely no indication that we are anywhere near AGI,” <a href="https://www.itu.int/en/fnc/Pages/bios/BERIDZEIrakli.aspx" target="_blank">Irakli Beridze</a>, Head of the Centre for Artificial Intelligence and Robotics, told <em><a href="https://futurism.com/human-level-artificial-intelligence-agi/" target="_blank">Futurism</a></em>. “And no one can say with any kind of authority or conviction that this would happen within a certain time frame. Or even worse, no one can say this can even happen period. We may never have AGI, so we need to take that into account when we are discussing anything.”</p><p>Still, there are a few trends helping to propel the development of AGI. These include, as AI venture capitalist Matt Turck detailed in a recent <a href="https://hackernoon.com/frontier-ai-how-far-are-we-from-artificial-general-intelligence-really-5b13b1ebcd4e" target="_blank">blog post</a>, increased access to AI tools and education, an uptick in AI research at major internet companies like Google and Facebook, the ever-increasing amount of available data with which researchers can train AI, massive accelerations in computing power, and progress <a href="https://www.technologyreview.com/s/612190/why-alibaba-is-investing-in-ai-chips-and-quantum-computing/" target="_blank">in quantum</a> and optical computing. But, ultimately, only time will tell.</p>