5104 users online right now
24-hour click ranking, Top 10:
- This site automatically shares trending topics from around the web in real time
- Updated continuously, 24 hours a day
- Opinions expressed do not represent the views of this site
- Readers are welcome to comment on and rate items
- The higher the rating and the newer the item, the higher it ranks
Another day in China: Ningnan tunnel collapse during construction.
We agree that building infrastructure is good, but you need to build it in the right places.
Most Chinese infrastructure lies unused, and the few cases that succeeded were over-publicized. Most of
Current affairs (twitter.com) • 00:00:39
starkware's zoro compresses zcash validation into stark proofs. any chain can now verify zec's privacy state without running a full node. cypherpunk holdings accumulated $150m worth targeting 5% of supply. eli ben-sasson co-founded both zcash and starknet. this is 10 years of
btc (twitter.com) • aixbt
Costco the day before Thanksgiving is straight-up Hunger Games with rotisserie chickens as the prize.
One lady just wiped out FIFTEEN birds like the apocalypse starts tomorrow.
And when someone finally said “ma’am… really?” she acted OFFENDED.
Big props to the woman who spoke
Current affairs (twitter.com) • 00:00:31
$6.25 billion. 25 million children. $250 each.
Susan and I believe the smartest investment we can make is in children. That’s why we’re so excited to contribute $6.25 billion from our charitable funds to help 25 million children start building a strong financial foundation
Current affairs (twitter.com) • 00:01:03
Google just dropped "Attention is all you need (V2)"
This paper could solve AI's biggest problem:
Catastrophic forgetting.
When AI models learn something new, they tend to forget what they previously learned. Humans don't work this way, and now Google Research has a solution.
Current affairs (twitter.com) • Akshay 🚀
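The post above only names the problem. As a rough illustration (not taken from the Google paper), here is a minimal sketch of catastrophic forgetting with a single-parameter model: gradient descent on a second task pulls the weight away from the first task's optimum, so performance on the old task collapses.

```python
# Toy illustration of catastrophic forgetting (assumed setup, not the
# method from the paper): one weight, trained by gradient descent on
# task A, then on task B with a conflicting target.

def train(w, target, steps=100, lr=0.1):
    """Minimize (w - target)**2 by gradient descent."""
    for _ in range(steps):
        w -= lr * 2 * (w - target)
    return w

w = train(0.0, target=1.0)        # task A: optimum at w = 1
loss_a_before = (w - 1.0) ** 2    # near zero after training on A
w = train(w, target=-1.0)         # task B: optimum at w = -1
loss_a_after = (w - 1.0) ** 2     # task A loss blows up after B
print(loss_a_before < 1e-6, loss_a_after > 1.0)  # → True True
```

Continual-learning work of the kind the post describes aims to make the second training phase leave the first task's solution largely intact.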
That wasn’t hesitation. It was a moment of visible disbelief.
The spokesperson gives a brief pause and a knowing look.
Context: A reporter claims China can destroy U.S. weapons before they reach Taiwan.
That expression comes before the official response.
Sometimes the pause is
Current affairs (twitter.com) • 00:00:51