Lenguador

joined 1 year ago
 

Achieves SOTA on quality AND on training time AND renders in real-time (60fps+)

 

Greatly reduces Stable Diffusion's issues with missing objects and mixed-up attributes

[–] [email protected] 3 points 1 year ago (1 children)

From Wikipedia: this is only a 1-sigma result compared to theory using lattice calculations. It would have been 5.1-sigma if the calculation method had not been improved.
Many calculations in the Standard Model are mathematically intractable with current methods, so improving the approximate solutions is non-trivial, and it's not surprising that improvements have been found.

 

SIGGRAPH 2023: Nvidia improves on their previous research into controllable, natural movement learnt from unlabelled data. Code and paper available.

[–] [email protected] 1 points 1 year ago

This seems like more of an achievement for the Barbie brand than for the individual director.

 
 
[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Apparently Inflection AI have bought 22,000 H100 GPUs. The H100 has approximately 4x the transformer compute of the A100. GPT-4 is rumored to be 10x larger than GPT-3, and GPT-3 takes approximately 34 days to train on 1024 A100 GPUs.

So with 22,000 * 4 / 1024 ≈ 86x the compute, they could easily train a model 10x the size of GPT-4 (roughly 100x the compute of GPT-3, assuming a fixed token count) in 1-2 months. Getting to 100x GPT-4's size would also be feasible, but they're likely banking on the claimed 3x speedup from FlashAttention-2, which would bring that run down to about 6 months of training.
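A minimal sketch of that arithmetic in Python. All inputs are the rumored/reported figures above, and the linear compute-vs-parameter-count scaling (token count held fixed) is an assumption, so treat the outputs as rough estimates:

```python
# Back-of-envelope training-time estimate from the rumored figures above.
A100_CLUSTER = 1024      # GPUs GPT-3 was reportedly trained on
GPT3_TRAIN_DAYS = 34     # approximate GPT-3 training time on that cluster
H100_COUNT = 22_000      # Inflection AI's reported purchase
H100_VS_A100 = 4         # assumed transformer-compute ratio, H100 vs A100

# Effective compute relative to the GPT-3 cluster: ~85.94x
compute_ratio = H100_COUNT * H100_VS_A100 / A100_CLUSTER

def train_days(size_vs_gpt3: float, software_speedup: float = 1.0) -> float:
    """Naive model: training compute scales linearly with parameter
    count (token count fixed), divided by available compute."""
    return GPT3_TRAIN_DAYS * size_vs_gpt3 / (compute_ratio * software_speedup)

# 10x GPT-4 ~ 100x GPT-3 (GPT-4 rumored to be 10x GPT-3)
print(f"{train_days(100):.0f} days")        # ~40 days, i.e. 1-2 months
# 100x GPT-4 ~ 1000x GPT-3, with the claimed 3x FlashAttention-2 speedup
print(f"{train_days(1000, 3.0):.0f} days")  # ~132 days, ballpark of ~6 months
```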

It's crazy that these scales and timelines seem plausible.

 

"We are about to train models that are 10 times larger than the cutting edge GPT-4 and then 100 times larger than GPT-4. That’s what things look like over the next 18 months."

 

Up to 100% improvement on unseen tasks, environments, and backgrounds

 
 
[–] [email protected] 4 points 1 year ago

This is an essay about the Barbie brand and its relationship to feminism and capitalism through history and the modern day. The Barbie movie is discussed but it's not the primary focus.

[–] [email protected] 2 points 1 year ago

NGC 1277 is unusual among galaxies in that it has had little interaction with surrounding galaxies.

I wonder if interactions between galaxies somehow convert regular matter to dark matter.

[–] [email protected] 1 points 1 year ago

Oh certainly, that series took a real risk with its writing style, and it's quite divisive as a result.
If you enjoy fantasy, you could try her other series as an alternative; The Inheritance Trilogy is written in a more conventional style.

[–] [email protected] 4 points 1 year ago (2 children)

I almost put The Fifth Season down after the first chapter; I remember thinking, "This author has a chip on their shoulder". I'm glad I persevered, though, and I definitely recommend the series to people, as it is quite different. I'd suggest giving it another shot.

[–] [email protected] 1 points 1 year ago (1 children)

I might try jumping in again on season 2, thanks.

[–] [email protected] 1 points 1 year ago

Claude 2 would have a much better chance at this because of its longer context window.
There are also plenty of alternate/theorised/critiqued endings for Game of Thrones online, so current chatbots should have a decent shot at doing a better job than a writer who hasn't finished his series in over a decade.

[–] [email protected] 2 points 1 year ago (4 children)

As a counterpoint to the other comments here, I didn't like Babylon 5. I gave up during the first season, on the episode about religions, where each alien race presents a single religion but humanity presents an enormous number of them.

Portraying planets in sci-fi as homogeneous is a common trope, but it's such a simplistic take. It resonated poorly with me because the aliens all behaved exactly like humans as well, to the point where there are stand-ins for Jehovah's Witnesses. That episode cemented the feeling I'd had while watching: Babylon 5 is racist against aliens.

 
 
[–] [email protected] 3 points 1 year ago (1 children)

This looks amazing, if true. The paper is claiming state of the art across literally every metric. Even in their ablation study the model outperforms all others.

I'm a bit suspicious that they don't extend their perplexity numbers to the 13B model, or provide its hyperparameters, even though they reference it in the text and in their scaling table.

Code will be released in a week: https://github.com/microsoft/unilm/tree/master/retnet

[–] [email protected] 2 points 1 year ago

Why do you say they have no representation? There are a lot of specific bodies operating in the government, advisory and otherwise, whose sole focus is indigenous affairs. And of course, indigenous Australians are currently over-represented among parliamentarians (more than 4% of parliamentarians are of indigenous descent).
