Google’s Semi-Conductor Lets You Conduct An Orchestra In Your Browser

If you're the kind of person who likes both music and technology, Google's new experiment will blow your mind. Google has launched an experiment called Semi-Conductor, a cocktail of music and technology (machine learning in particular) that runs directly in your browser.

Semi-Conductor is an AI experiment that lets you conduct an orchestra in your browser: just wave your arms in front of the camera and the music will follow your command. A machine learning model tracks your movements and, based on them, generates the music in real time.

If you want to try it, just open Semi-Conductor in Google Chrome or any browser you like. Once you are on the Semi-Conductor website, give it access to your webcam. Then place your laptop or phone on a stable surface, and your screen will show an instruction like "Fit your body in the frame."

Take a few steps back until both of your arms are visible in the frame. Once you're in a comfortable position, start conducting.

Semi-Conductor also suggests traditional conducting gestures, like "move up and down to play louder and softer" or "move from side to side to control which sections play," to help users get an early start.

You can move your arms to change the tempo, volume, and instrumentation of a piece of music.
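To make the idea concrete, here is a hypothetical sketch (not Google's actual code) of how a tracked wrist position might be mapped to the conducting controls described above, assuming standard image coordinates where y grows downward:

```javascript
// Hypothetical sketch, not Semi-Conductor's real implementation:
// translate a wrist position into volume and orchestra section.
function conductParams(wrist, frame) {
  // Hands held higher (smaller y in image coordinates) play louder;
  // volume is normalized to the range 0..1.
  const volume = 1 - wrist.y / frame.height;
  // Moving from side to side selects one of four orchestra sections.
  const section = Math.min(3, Math.floor((wrist.x / frame.width) * 4));
  return { volume, section };
}
```

For example, a wrist at the very top-left of a 640×480 frame would yield full volume and the leftmost section: `conductParams({ x: 0, y: 0 }, { width: 640, height: 480 })` returns `{ volume: 1, section: 0 }`.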


For now, Semi-Conductor offers only one piece: the first movement of Mozart's Serenade No. 13, K. 525, "Eine kleine Nachtmusik," which should be immediately familiar to most listeners after the first few seconds.

Semi-Conductor is the latest in a long line of machine learning experiments. It uses TensorFlow.js, a machine learning library that runs in the browser, to map out your movements through your webcam. An algorithm plays along to the score as you conduct, using hundreds of tiny audio files from live recorded instruments.

It works using PoseNet, a machine learning model that estimates human poses directly in your browser. PoseNet detects human figures in images and videos without any specialized hardware or software.
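PoseNet's output is a pose made up of keypoints (nose, elbows, wrists, and so on), each with a confidence score. As a rough illustration, an app like Semi-Conductor might pull out just the wrist keypoints it can trust; the field names below follow PoseNet's documented output format, but the helper itself is an assumption:

```javascript
// Hypothetical helper (PoseNet-style pose object assumed):
// keep only the wrist keypoints whose confidence clears a threshold.
function getWrists(pose, minScore = 0.5) {
  return pose.keypoints.filter(
    k => (k.part === 'leftWrist' || k.part === 'rightWrist') && k.score >= minScore
  );
}
```

Filtering on the confidence score matters here: a low-score wrist is usually an occluded or misdetected one, and conducting off it would make the music stutter.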

Past examples have included AI Duet (which invites you to play a virtual piano in your web browser and provides a virtual accompaniment), and AutoDraw (which interprets your rough doodles and turns them into neat clipart).

Best of all, because all the processing happens within your browser, no data is sent to Google or anyone else, so there’ll be no evidence of your embarrassing flailing.

Semi-Conductor is built by Rupert Parry, Melissa Lu, Haylie Craig and Samantha Cordingley from Google Creative Lab, Sydney.


via Google
