2,150 users online now
24-hour click ranking, Top 10:
 - This site automatically shares trending online topics in real time
 - Updated in real time, 24 hours a day
 - None of the opinions expressed represent this site's position
 - Comments and ratings on items are welcome
 - The higher the rating and the newer the item, the higher it is ranked
 
Jensen Huang: It's easier to fall in love with what you do than to find what you love
“A lot of people say, ‘Find something you love.’ I don’t know about that. I guess I’ve fallen in love with many things that I do. I loved it when I was a dishwasher. I loved it when I was a …
btc (twitter.com)
Submitted by Startup Archive
            
                        
                    
        
230k GPUs, including 30k GB200s, are operational for training Grok in a single supercluster called Colossus 1 (inference is done by our cloud providers).
At Colossus 2, the first batch of 550k GB200s & GB300s, also for training, start going online in a few weeks.
As Jensen …
                    
                    
btc (twitter.com)
            
            
                            
                                            