1-dan master of the unyielding fist of Bayesian inference
6186 stories
·
1 follower

Meta announces nuclear energy projects

1 Share

Article URL: https://about.fb.com/news/2026/01/meta-nuclear-energy-projects-power-american-ai-leadership/

Comments URL: https://news.ycombinator.com/item?id=46578497

Points: 270

# Comments: 345

Read the whole story
clumma
3 minutes ago
reply
Berkeley, CA
Share this story
Delete

Hyundai Introduces Its Next-Gen Atlas Robot at CES 2026 [video]


Article URL: https://www.youtube.com/watch?v=9e0SQn9uUlw

Comments URL: https://news.ycombinator.com/item?id=46520508

Points: 131

# Comments: 114


Six-decade math puzzle solved by Korean mathematician


Article URL: https://www.koreaherald.com/article/10648326

Comments URL: https://news.ycombinator.com/item?id=46491129

Points: 260

# Comments: 85


ACM Is Now Open Access


Article URL: https://www.acm.org/articles/bulletins/2026/january/acm-open-access

Comments URL: https://news.ycombinator.com/item?id=46454763

Points: 337

# Comments: 51


Cystathionine γ-lyase is a major regulator of cognitive function through neurotrophin signaling and neurogenesis | PNAS

submitted by /u/Eonobius to r/longevity

Computational Depth


I'm posting from Oxford University where I will be spending the "Hilary Term" (through late March) as a visiting fellow at Magdalen College. If you are relatively local, reach out if you'd like to connect.

I plan to get back into research after thirteen years of administration, working primarily with Rahul Santhanam and his group. I haven't had a significant sabbatical or leave since Amsterdam 30 years ago, which is what comes from changing jobs too often.

Today I'd like to talk about a 2006 paper on a topic I first thought about in Amsterdam, one that will likely play a role in this visit: Computational Depth: Concept and Applications, by Luis Antunes, Dieter van Melkebeek, Vinod Variyam and myself.

In Amsterdam, I was hosted by Paul Vitányi and Harry Buhrman at CWI, and naturally worked on Kolmogorov complexity, the algorithmic study of randomness, as well as various problems in computational complexity.

Very simple strings don't have much information. Completely random strings have maximal Kolmogorov complexity but aren't particularly useful either, since we can create our own random strings by flipping coins. Is there some way to measure useful information?

In particular, I had this question: If NP reduces to a sparse set, then it has polynomial-size circuits. If NP reduces to a random set, then it has polynomial-size circuits. Is this just a coincidence, or is there a broader principle here?
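In rough symbols (my notation and choice of reduction type, not spelled out in the post), the two facts being unified are:

```latex
% Sparse-set case: polynomially few strings at each length
\mathrm{NP} \le^{p}_{T} S,\quad |S \cap \{0,1\}^{n}| \le n^{O(1)}
  \;\Longrightarrow\; \mathrm{NP} \subseteq \mathrm{P/poly}

% Random-set case: an algorithmically random oracle R
\mathrm{NP} \le^{p}_{T} R,\quad R \text{ random}
  \;\Longrightarrow\; \mathrm{NP} \subseteq \mathrm{P/poly}
```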

Charlie Bennett developed a notion of logical depth with a similar philosophy, but it is a difficult definition to work with. Luis Antunes, a student from Porto, came to work with me at NEC Research in New Jersey around the turn of the century. We developed the simpler notion of computational depth as the difference of two Kolmogorov measures, for example time-bounded Kolmogorov complexity minus traditional Kolmogorov complexity. This difference is small both for very easy strings and for fully random strings. With then-NEC postdoc (and now Nebraska professor) Vinod Variyam, we found a connection to finding SAT witnesses, and with DIMACS and IAS postdoc Dieter van Melkebeek (now a Wisconsin professor), we came up with the notion of shallow sets, which generalize both random and sparse sets; as hoped, if NP reduces to a shallow set then it has polynomial-size circuits. Luis would title his PhD thesis "Useless Information".
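The "difference of two measures" idea can be illustrated with a toy proxy. Kolmogorov complexity is uncomputable, but compressed length is a standard stand-in, and the gap between a quick compressor and a more thorough one crudely mirrors time-bounded K^t(x) minus K(x). This sketch is my own illustration, not anything from the paper:

```python
import os
import zlib

def depth_proxy(x: bytes) -> int:
    """Toy stand-in for computational depth: a resource-bounded
    complexity proxy minus a less-bounded one.

    Fast compression plays the role of time-bounded K^t(x);
    thorough compression plays the role of plain K(x).
    """
    fast = len(zlib.compress(x, 1))   # quick-and-dirty compressor
    best = len(zlib.compress(x, 9))   # try-much-harder compressor
    return fast - best

simple = b"ab" * 2048      # highly regular: both proxies tiny
noise = os.urandom(4096)   # incompressible: both proxies near 4096

# Depth is small at both extremes, matching the intuition that
# neither trivial nor random strings carry much useful information.
print(depth_proxy(simple), depth_proxy(noise))
```

Both differences come out small relative to the 4096-byte inputs, even though the individual compressed sizes sit at opposite extremes.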

Luis and I would later find a nice connection between computational depth and average-case complexity.

Computational Depth had a resurgence of popularity with the rise of meta-complexity, with 50 of the original paper's 90 citations coming since 2020.

So I hope to find new applications of computational depth working with Rahul, who is at the center of meta-complexity. Maybe computational complexity can even help us understand machine learning better, along the lines of my frozen concentrate analogy: the learning process removes the randomness, leaving structured information behind.
