MIT develops new neural network chips

Neural networks are very powerful, but they require a lot of energy. MIT engineers have now developed a new chip that reduces the power consumption of neural networks by 95 percent, which could allow them to run on battery-powered mobile devices.

Smartphones are becoming smarter, offering more and more artificial intelligence services such as digital assistants and real-time translation. However, the neural networks that do the data crunching for these services usually run in the cloud, so smartphone data has to be shuttled back and forth between the device and remote servers.

This is not ideal: it requires a lot of communication bandwidth, and it means potentially sensitive data is transmitted to and stored on servers outside the user's control. At the same time, running a neural network on a graphics processor consumes so much energy that it is impractical on devices with limited battery power.


MIT engineers have now designed a chip that dramatically reduces the need to shuttle data back and forth between the chip's memory and its processors, cutting power consumption by 95 percent. A neural network consists of thousands of artificial neurons connected to one another in layers. Each neuron receives input from multiple neurons in the layer below it, and if the combined input passes a particular threshold, it sends an output to multiple neurons in the layer above. The strength of the connections between neurons is governed by weights that are set during training.
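
To make that structure concrete, here is a minimal sketch in Python of the layered, threshold-based neuron model described above; the function names and numbers are purely illustrative and are not taken from the MIT work.

```python
# Minimal sketch of a layered network of threshold neurons (illustrative only).

def neuron_output(inputs, weights, threshold=0.0):
    """Combine weighted inputs from the layer below; fire only past the threshold."""
    combined = sum(x * w for x, w in zip(inputs, weights))
    return combined if combined > threshold else 0.0

def layer_output(inputs, weight_rows, threshold=0.0):
    """Each row of weights defines one neuron in the layer above."""
    return [neuron_output(inputs, row, threshold) for row in weight_rows]

# Two layers: 3 inputs -> 2 hidden neurons -> 1 output neuron.
hidden = layer_output([0.5, -1.2, 0.8],
                      [[0.4, 0.1, -0.6], [0.9, -0.3, 0.2]])
final = layer_output(hidden, [[1.0, -0.5]])
print(final)
```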

This means that for each neuron, a conventional chip must retrieve the input data for a particular connection and the corresponding connection weight from memory, multiply them, store the result, and then repeat the process for every input. That is a lot of data movement, and data movement costs energy. MIT's new chip takes a different approach, using analog circuitry to compute all of the inputs in parallel inside the memory itself. This greatly reduces the amount of data that has to be pushed around and saves a great deal of energy as a result. The approach requires that connection weights be binary rather than a range of values, but previous theoretical work had shown that this has little impact on accuracy, and the researchers found that the chip's results were within 2 to 3 percent of those of a conventional non-binary neural network running on a standard computer.
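
As a rough software analogy of the binary-weight simplification (the actual chip performs this accumulation with analog circuitry inside the memory array, which code cannot capture), the sketch below contrasts a conventional multiply-accumulate with the binary case, where every multiplication collapses into an addition or subtraction; the weights and inputs are made up for illustration.

```python
# Rough software analogy of the binary-weight idea (illustrative values only).

def binarize(weights):
    """Map full-precision trained weights to +1/-1 by sign."""
    return [1.0 if w >= 0 else -1.0 for w in weights]

def full_precision_sum(inputs, weights):
    # Conventional path: fetch each input/weight pair, multiply, accumulate.
    return sum(x * w for x, w in zip(inputs, weights))

def binary_weight_sum(inputs, bin_weights):
    # Binary-weight path: no multiplications, just signed accumulation.
    return sum(x if w > 0 else -x for x, w in zip(inputs, bin_weights))

weights = [0.7, -0.2, 0.05, -0.9]
inputs = [1.0, 0.5, -0.3, 0.8]
print(full_precision_sum(inputs, weights))           # -0.135
print(binary_weight_sum(inputs, binarize(weights)))  # -0.6 (coarser, same sign)
```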

This is not the first time researchers have created a chip that processes data in memory to reduce the power consumption of neural networks, but it is the first time the approach has been used to run a powerful convolutional neural network, the kind of network behind image-based artificial intelligence applications. Dario Gil, IBM's vice president of artificial intelligence, said in a statement: "The results show impressive performance when using memory arrays for convolution operations. It will certainly open up the possibility of employing more complex convolutional neural networks for image and video classification in the IoT in the future."

The MIT team is not the only one working on this problem, however. The desire to put artificial intelligence into smartphones, home appliances, and all kinds of IoT devices is driving Silicon Valley's biggest players toward low-power AI chips.

Apple has integrated its Neural Engine into the iPhone X to power its facial recognition technology, and Amazon is rumored to be developing its own custom AI chips for the next generation of its Echo digital assistant. Large chip companies are also increasingly moving to support advanced features like machine learning, which forces them to make their devices more energy efficient. Earlier this year, ARM introduced two new chips: the ARM Machine Learning processor, aimed at AI tasks from translation to facial recognition, and the ARM Object Detection processor, designed to detect faces in images.

Qualcomm's latest mobile chip, the Snapdragon 845, features a graphics processor with artificial intelligence as a core focus, and the company has also released the Snapdragon 820E, which is aimed at drones, robots, and industrial devices. Looking further ahead, IBM and Intel are developing neuromorphic chips whose architectures are inspired by the human brain and its remarkable energy efficiency. In theory, that could allow IBM's TrueNorth chip and Intel's Loihi chip to run powerful machine learning on a fraction of the energy required by conventional chips, though at this stage both technologies remain highly experimental.

Getting these chips to deliver AI as powerful as that offered by cloud-based services will be a big challenge. But at the current pace of innovation, the day you can hold true artificial intelligence in the palm of your hand may not be far off.

