John Hopfield significantly advanced the study of structures that can store and reconstruct information by introducing what are now known as Hopfield networks, a type of recurrent artificial neural network. He proposed this model, a neural network with symmetric weights, in 1982. Hopfield networks serve as content-addressable ("associative") memory systems with binary threshold nodes, meaning they can recognize patterns and remember information. The networks rest on the simple principle that 'neurons that fire together, wire together', which lets them recall complete memories from partial inputs, and they have contributed significantly to our understanding of memory structures and pattern recognition in the brain. His work has been instrumental in the development of machine learning and artificial intelligence: he demonstrated that these neural networks could learn and process information in a way strikingly similar to the human brain, thereby opening new avenues for AI development. -- by ChatGPT
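A minimal sketch of the idea described above (my own illustration, not from the post): store binary patterns with a Hebbian rule, then let binary threshold updates pull a corrupted cue back toward the nearest stored memory. The patterns, the number of update sweeps, and the corruption are arbitrary choices for demonstration.

```python
import numpy as np

def train(patterns):
    # Hebbian learning: "neurons that fire together, wire together".
    # Weights are symmetric and the diagonal is zeroed out.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:              # each p has entries in {-1, +1}
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / patterns.shape[0]

def recall(W, state, sweeps=5):
    # Asynchronous updates with binary threshold units; a partial or
    # noisy cue settles into the closest stored pattern.
    state = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train(patterns)
cue = patterns[0].copy()
cue[:2] = -cue[:2]                  # flip two bits of the first memory
print(recall(W, cue))               # recovers the original first pattern
```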
Geoffrey Hinton, a renowned computer scientist and cognitive psychologist, is best known for his work on artificial neural networks. In the 1980s, he and his colleagues proposed a type of neural network called the Boltzmann Machine, which can detect patterns and trends in data. In the early 2000s, Hinton introduced another method, known as "Contrastive Divergence", as a fast way to train Boltzmann machines. His most significant contribution is the development of a concept called "Deep Learning". The crux of the method is using algorithmic layers (hence the term 'deep') to filter inputs (like photo pixels) and gradually refine them into complex outputs (like the identification of a face). Hinton's work on deep learning is highly relevant to large neural networks such as those behind Google's search algorithms or Facebook's facial recognition technology. His inventions laid a much-needed foundation for automatic feature detection in large neural networks working from raw, unstructured data. Such data is typically harder for machines to understand, so his work has been revolutionary in improving machines' ability to learn and understand complex data efficiently. -- by ChatGPT
Which contribution does the Nobel citation's "a method that can independently discover properties in data and which has become important for the large neural networks" refer to? Is it Back Propagation or Deep Learning?
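For the Contrastive Divergence part, here is a minimal CD-1 sketch for a small binary restricted Boltzmann machine (the variant CD is usually applied to), purely as an illustration of the training step described above; the layer sizes, learning rate, and random toy data are my own assumptions, not anything from the post or the citation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 3, 0.1
W = rng.normal(0, 0.01, (n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0):
    # Positive phase: hidden probabilities given the data vector.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(n_hidden) < p_h0).astype(float)
    # Negative phase: one Gibbs step back to a "reconstruction".
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(n_visible) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Approximate gradient: data statistics minus reconstruction statistics.
    return (np.outer(v0, p_h0) - np.outer(v1, p_h1),
            v0 - v1, p_h0 - p_h1)

data = (rng.random((20, n_visible)) < 0.5).astype(float)  # toy binary data
for epoch in range(100):
    for v in data:
        dW, db_v, db_h = cd1_step(v)
        W += lr * dW
        b_v += lr * db_v
        b_h += lr * db_h
```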
Made a huge contribution to the US stock market.
Go take a look at openAI.
Might as well give it to Nvidia's Jensen Huang, the father of compute power.
The Han family is truly a legend.
Isn't that exactly how it should be? Technology drives productivity, and the stock market pulls in investment for tech companies.
AI has the A but not the I; the productivity is all in China's robots.
Trying to achieve the I with humanity's current linear, continuous mathematics plus computer horsepower is wishful thinking.
Count me among the skeptics. Unless AI can change human genes and recreate hormones and blood, it's the difference between artificial cocoa powder and natural coffee.
All things have a spirit and a soul; machines have neither.
This one might have to go to the first author of that important patent of theirs.
Speech recognition, image recognition, object detection, object segmentation, and natural language learning, including language translation.
But AI winning the physics prize is something I can't wrap my head around. I must be misunderstanding what physics is; it's simply unbelievable.
Agreed with the poster above. I might accept that their achievements go beyond Nobel level, but giving them the Nobel Prize in Physics is hard for me to accept.
Blame it on the Nobel Prize having no mathematics category, so it had to be shoehorned under physics.
There's that 'soul' argument again. Sigh, if humans really had no soul, we'd be in a miserable state by now.
The trend in recent years: the medicine prize is turning into chemistry, the chemistry prize into physics, and now it seems the physics prize is turning into mathematics.
Sunlight takes eight minutes to reach the Earth, yet light insists it can reach anywhere in the universe instantaneously. Not to be trusted.
What advantage does an AI driving an excavator have over a Lanxiang apprentice driving one?
If it flips over, at least you don't have to worry about the driver dying....
That person understands neither large language models nor human beings.
😂, how did they convince him it wasn't a scammer? That phone call really knew how to pick its moment.
You could say the essence of physics is mathematics, or that the essence of the world is mathematics. Many astrophysical phenomena were first derived mathematically and only observed later.
Reasonable suspicion: we are actually living inside the Matrix.
Even an amateur can see it's less than 1.