Teanut

joined 2 years ago
[–] Teanut@lemmy.world 3 points 1 month ago

Please don't make troughs become a thing again!

[–] Teanut@lemmy.world 14 points 1 month ago

If you use the profiler and see that the slower operation is being used frequently, and is taking up a chunk of time deemed significant, why not swap it to the faster version?

In a simulation I'm working on that runs through 42 million rounds, I spent some time profiling, going through the code that was eating up a lot of time (especially anything executed all 42 million times), and looking for optimizations. That brought the run time down from about 10 minutes to 5 minutes.

I certainly wasn't going to start over in C++ or Rust, and if I'd started with either of those languages I would have missed out on a lot of really strong Python libraries and probably spent more time coding rather than refining the simulation.
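For anyone curious, the kind of profiling described above can be done entirely with the standard library. This is a minimal sketch (the simulation function here is a hypothetical stand-in, and the round count is scaled down for illustration):

```python
# Sketch: finding hot functions with the stdlib cProfile/pstats modules.
# simulate_round and run are hypothetical stand-ins for the real simulation.
import cProfile
import io
import pstats

def simulate_round(i):
    # Stand-in for one round's worth of work.
    return (i * i) % 97

def run(rounds):
    total = 0
    for i in range(rounds):
        total += simulate_round(i)
    return total

profiler = cProfile.Profile()
profiler.enable()
result = run(100_000)  # scaled down from 42 million for the sketch
profiler.disable()

# Print the top 5 entries by cumulative time; functions called every
# round (like simulate_round here) show up near the top.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

Once the report shows which per-round functions dominate, swapping in a faster equivalent (or hoisting work out of the loop) is where the 10-minutes-to-5-minutes kind of win comes from.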

[–] Teanut@lemmy.world 1 point 2 months ago (1 children)

How rough was the switch? I can't imagine changing all my accounts that use my Gmail address, but I'd like to move to something else.

[–] Teanut@lemmy.world 4 points 3 months ago

Lucky you! I need to check my university's current GPU power but sadly my thesis won't be needing that kind of horsepower, so I won't be able to give it a try unless I pay AWS or someone else for it on my own dime.

[–] Teanut@lemmy.world 7 points 3 months ago (2 children)

Nvidia's new Digits workstation, while expensive from a consumer standpoint, should be a great tool for local inferencing research. $3000 for 128GB isn't a crazy amount for a university or other researcher to spend, especially when you look at the price of the 5090.

[–] Teanut@lemmy.world 49 points 3 months ago (10 children)

In fairness, unless you have about 800GB of VRAM/HBM you're not running true Deepseek yet. The smaller models are Llama or Qwen distilled from Deepseek R1.

I'm really hoping Deepseek releases smaller models that I can fit on a 16GB GPU and try at home.
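A rough way to see why ~800GB is the right ballpark: model weights alone take roughly (parameter count × bytes per parameter), before you even count activations or KV cache. A back-of-the-envelope sketch, assuming the widely reported ~671B parameter count for Deepseek R1:

```python
# Back-of-the-envelope weight-memory estimate. Ignores activations and
# KV cache, so real serving needs more. The 671B figure for Deepseek R1
# is an assumption from public reports.
def weight_memory_gb(params_billion, bytes_per_param):
    """Memory needed to hold the weights, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

fp8_gb = weight_memory_gb(671, 1)   # 8-bit weights: ~671 GB
fp16_gb = weight_memory_gb(671, 2)  # 16-bit weights: ~1342 GB
print(fp8_gb, fp16_gb)
```

Even at 8 bits per parameter the weights alone blow past a single consumer GPU by an order of magnitude, which is why the locally runnable "R1" models are the smaller distills.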

[–] Teanut@lemmy.world 11 points 4 months ago (3 children)

Cellulose isn't plastic, though; it's the sugar that makes up plant cell walls, like wood. Cotton fibers are 90% cellulose: https://en.wikipedia.org/wiki/Cellulose

I'm confused why they included cellulose without clarifying that it's not a petrochemical, unless cellulose micro and nano particles are also an issue now. Maybe I should read the original study...

[–] Teanut@lemmy.world 23 points 4 months ago (3 children)

I did notice this news article that mentions:

New Yorkers who live within the Congestion Relief Zone will not be charged to drive or park around the area. They will only be charged once they leave and cross back into the zone.

[–] Teanut@lemmy.world 34 points 4 months ago (11 children)

I believe OP is talking about NYC's Congestion Pricing.

[–] Teanut@lemmy.world 19 points 6 months ago (2 children)

Wild that dentist doesn't have any restrictions but doctor does.

[–] Teanut@lemmy.world 19 points 7 months ago (1 children)

Linux is a lot, lot, lot easier to use now than it was in the 90s.

[–] Teanut@lemmy.world 32 points 8 months ago* (last edited 8 months ago) (1 children)

Mobile phones in the era before smartphones had cameras, email clients, games, music players, and even web browsers. They just weren't very good at those functions, and their core feature was being a phone for voice calls. Texting was barely a feature on some of them: the first camera phone in the United States, the Sanyo SCP-5300, didn't have a two-way text messaging client. The user had to go to a website on the phone to send texts, which was inconvenient even on a 1xRTT 3G connection.

The e-ink phone seems closer to a dumbphone than a smartphone, IMO, largely because it lacks access to an app store.

Source: I sold mobile phones before smartphones and during the early smartphone years (BlackBerry and Palm Treo, for example.)

Edit: calling it a feature phone instead of a dumb phone might be more accurate.
