

Image credit: KTSDESIGN / stock.adobe.com

Stanford University finds that AI is outpacing Moore’s Law


Every three months, the speed of artificial intelligence computation doubles, according to Stanford University’s 2019 AI Index report


Cliff Saran, Managing Editor


Published: 12 Dec 2019 9:56

Stanford University’s AI Index 2019 annual report has found that the speed of artificial intelligence (AI) computation is outpacing Moore’s Law.

Moore’s Law describes how processor speeds double every 18 months to two years, meaning application developers can expect application performance to double for the same hardware cost.

But the Stanford report, produced in partnership with McKinsey & Company, Google, PwC, OpenAI, Genpact and AI21Labs, found that AI
computational power is accelerating faster than traditional processor development. “Prior to 2012, AI results closely tracked Moore’s Law, with
compute doubling every two years,” the report said. “Post-2012, compute has been doubling every 3.4 months.”

The study looked at how AI algorithms have improved over time, by tracking the progress of the ImageNet image identification program. Given that
image classification methods are largely based on supervised machine learning techniques, the report’s authors looked at how long it takes to train an
AI model and associated costs, which they said represents a measurement of the maturity of AI development infrastructure, reflecting advances in
software and hardware.

Their research found that over 18 months, the time required to train a network on cloud infrastructure for supervised image recognition fell from about
three hours in October 2017 to about 88 seconds in July 2019. The report noted that data on ImageNet training time on private cloud instances was in
line with the public cloud AI training time improvements.
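As a rough sanity check on these figures (an illustrative calculation of my own, not from the report), the implied doubling time of training speed can be backed out of the quoted numbers:

```python
import math

# Report figures: ~3 hours (October 2017) down to ~88 seconds (July 2019),
# a window the report describes as 18 months.
before_s = 3 * 3600   # ~3 hours, in seconds
after_s = 88
window_months = 18

speedup = before_s / after_s            # ~123x faster
doublings = math.log2(speedup)          # ~6.9 doublings
months_per_doubling = window_months / doublings

print(f"Speedup: {speedup:.0f}x")
print(f"Implied doubling time: {months_per_doubling:.1f} months")
```

That works out to a doubling in training speed roughly every two to three months, broadly consistent with the report’s 3.4-month figure for compute.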

The report’s authors used the ResNet image classification model to assess how long it takes algorithms to achieve a high level of accuracy. In
October 2017, 13 days of training were required to reach just above 93% accuracy, and the report found that training an AI-based image
classifier for those 13 days would have cost about $2,323 in 2017.

The study reported that the latest benchmark available on Stanford DAWNBench, using a cloud TPU on Google Cloud Platform (GCP) to run the ResNet model to attain
image classification accuracy slightly above 93%, cost just over $12 in September 2018.


The report also explored how far computer vision had progressed, looking at innovative algorithms that push the limits of automatic activity
understanding, recognising human actions and activities from videos, as measured by the ActivityNet Challenge.

One of the tasks in this challenge, called Temporal Activity Localisation, uses long video sequences that depict more than one activity, and the
algorithm is asked to find a given activity. Today, algorithms can accurately recognise hundreds of complex human activities in real time, but the report
found that much more work is needed.

“After organising the International Activity Recognition Challenge (ActivityNet) for the last four years, we observe that more research is needed to
develop methods that can reliably discriminate activities, which involve fine-grained motions and/or subtle patterns in motion cues, objects and
human-object interactions,” said Bernard Ghanem, associate professor of electrical engineering at King Abdullah University of Science and
Technology, in the report.

“Looking forward, we foresee the next generation of algorithms to be one that accentuates learning without the need for excessively large manually
curated data. In this scenario, benchmarks and competitions will remain a cornerstone to track progress in this self-learning domain.”


Join the conversation (1 comment)

Wayne Caswell, 15 Dec 2019 10:03 AM
I started with IBM in 1969 and watched exponentially faster processors enable new generations of software, which then enabled faster development of new hardware. The AI trend
identified here greatly accelerates that trend, making the task of predicting future impacts exceptionally difficult, and more so each year, or month. I’ll add a reference link to this after my
own article, “Moore’s Law and the Future of Healthcare.” (https://mHealthTalk.com/moores-law/)


All Rights Reserved, Copyright 2000 - 2019, TechTarget
