Facebook’s chief AI scientist: Deep learning may need a new programming language

Deep learning may need a new programming language that’s more flexible and easier to work with than Python, Facebook AI Research director Yann LeCun said today. It’s not yet clear whether such a language is necessary, but any candidate would run up against researchers’ and engineers’ entrenched preference for Python, he said.

LeCun has worked with neural networks since the 1980s.

“There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it’s not clear at all that the community will follow, because people just want to use Python,” LeCun said in a phone call with VentureBeat.

“The question now is, is that a valid approach?”

Python is currently the most popular language used by developers working on machine learning projects, according to GitHub’s recent Octoverse report, and the language forms the basis for Facebook’s PyTorch and Google’s TensorFlow frameworks.

LeCun presented a paper exploring the latest trends and spoke before makers of next-generation computer chips at the IEEE International Solid-State Circuits Conference (ISSCC) today in San Francisco.

The first portion of the paper is devoted to lessons LeCun took away from Bell Labs, including his observation that AI researchers’ and computer scientists’ imaginations tend to be tied to their hardware and software tools.

Artificial intelligence is more than 50 years old, but its current rise has been closely linked to the growth in compute power provided by computer chips and other hardware.

A virtuous cycle, in which better hardware enables better algorithms, better algorithms deliver better performance, and better performance draws more people to build better hardware, is only a few years old, said LeCun, who worked at Bell Labs in the 1980s and developed convolutional neural networks (ConvNets, or CNNs) to read zip codes on postal envelopes and bank checks.

In the early 2000s, after leaving Bell Labs and joining New York University, LeCun worked with other luminaries in the space, like Yoshua Bengio and Geoffrey Hinton, conducting research to revive interest in neural networks and grow the popularity of deep learning.

In recent years, advances in hardware — like field programmable gate arrays (FPGA), tensor processing units (TPU) from Google, and graphics processing units (GPU) — have played a major role in the industry’s growth. Facebook is reportedly also working on its own semiconductor.

“The kind of hardware that’s available has a big influence on the kind of research that people do, and so the direction of AI in the next decade or so is going to be greatly influenced by what hardware becomes available,” he said. “It’s very humbling for computer scientists, because we like to think in the abstract that we’re not bound by the limitation of our hardware, but in fact we are.”

LeCun highlighted a number of AI trends hardware makers should consider in the years ahead and recommended architectures for the near future, urging that the growing size of deep learning systems be taken into account.

He also spoke about the need for hardware designed specifically for deep learning, in particular hardware that can handle a batch of one, rather than requiring multiple training samples to be batched together to run a neural net efficiently, as is standard today.

“If you run a single image, you’re not going to be able to exploit all the computation that’s available to you in a GPU. You’re going to waste resources, basically, so batching forces you to think about certain ways of training neural nets,” he said.
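The batching trade-off LeCun describes can be sketched in a few lines. This is a hedged illustration (not from the article, and using NumPy rather than a GPU framework): one matrix multiply over a stacked batch does the same arithmetic as many per-sample multiplies, but in a single call that parallel hardware can saturate, whereas a batch of one leaves most of that capacity idle.

```python
import numpy as np

# Illustrative sketch: why frameworks batch inputs. The layer size and
# batch size here are arbitrary, chosen only to make the shapes concrete.
rng = np.random.default_rng(0)
weights = rng.standard_normal((784, 128))   # one dense layer: 784 -> 128
batch = rng.standard_normal((32, 784))      # 32 flattened "images"

# Batched: a single call processes all 32 samples at once.
batched_out = batch @ weights               # shape (32, 128)

# Batch of one: 32 separate calls, each using only a sliver of the hardware.
single_out = np.stack([sample @ weights for sample in batch])

assert batched_out.shape == (32, 128)
assert np.allclose(batched_out, single_out)  # same math, different utilization
```

Both paths compute identical results; the difference is purely how well each maps onto parallel hardware, which is the constraint batch-of-one chips would aim to remove.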

He also recommended dynamic networks and hardware that can adjust to utilize only the neurons needed for a task.
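One way to read "utilize only the neurons needed" is conditional computation, sketched below under that assumption. The gate and the two expert sub-networks are hypothetical names for illustration; the point is that a cheap routing decision means only one expert's weights are ever touched per input.

```python
import numpy as np

# Hypothetical sketch of a dynamic network: a small gate routes each input
# to one of two "expert" sub-networks, so the other expert's neurons are
# skipped entirely for that input.
rng = np.random.default_rng(1)
gate_w = rng.standard_normal(16)                      # cheap routing weights
experts = [rng.standard_normal((16, 4)) for _ in range(2)]

def dynamic_forward(x):
    # The gate picks an expert; the unchosen expert is never evaluated.
    chosen = int(x @ gate_w > 0)
    return x @ experts[chosen], chosen

x = rng.standard_normal(16)
out, which = dynamic_forward(x)
assert out.shape == (4,)
assert which in (0, 1)
```

Hardware that assumes a fixed, dense computation graph handles this pattern poorly, which is why LeCun flags adjustable utilization as a design target.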

In the paper, LeCun reiterated his belief that self-supervised learning will play a major role in advancing state-of-the-art AI.

“If self-supervised learning eventually allows machines to learn vast amounts of background knowledge about how the world works through observation, one may hypothesize that some form of machine common sense could emerge,” LeCun wrote in the paper.

LeCun believes that future deep learning systems will largely be trained with self-supervised learning, and that new high-performance hardware will be needed to support it.
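The core idea of self-supervised learning, labels derived from the data itself rather than from human annotation, can be shown with a toy example. This is a minimal sketch, not any method from LeCun's paper: a linear model learns to predict a "masked" sample of a smooth signal from its neighbors, the same fill-in-the-blank principle behind larger self-supervised systems.

```python
import numpy as np

# Toy self-supervised task: predict the middle sample of a sliding window
# from its 4 neighbors. The targets come from the signal itself -- no labels.
rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, 20, 400)) + 0.01 * rng.standard_normal(400)

# Build (context, target) pairs: 2 samples before and 2 after each target.
contexts = np.stack([np.r_[signal[i - 2:i], signal[i + 1:i + 3]]
                     for i in range(2, len(signal) - 2)])
targets = signal[2:-2]

# Least-squares fit of a linear predictor for the masked sample.
w, *_ = np.linalg.lstsq(contexts, targets, rcond=None)
error = np.mean((contexts @ w - targets) ** 2)
assert error < 1e-3  # neighbors predict the middle of a smooth signal well
```

Scaled up from four neighbors on a sine wave to billions of parameters on video or text, this fill-in-the-blank objective is the kind of workload LeCun argues future hardware will need to run efficiently.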

Last month, LeCun discussed the importance of self-supervised learning with VentureBeat as part of a story about predictions for AI in 2019. Hardware that can handle self-supervised learning will be important for Facebook, as well as for autonomous driving, robotics, and many other forms of technology.



Published April 25, 2019 | Categories: Tech News
