ubergeek77

joined 2 years ago
[–] ubergeek77@lemmy.ubergeek77.chat 22 points 3 weeks ago (2 children)

Be careful, that's an AI generated review. Definitely try to find other human reviews before you buy it.

[–] ubergeek77@lemmy.ubergeek77.chat 40 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

If you go through that user's other reviews, while they are very funny, they're also very AI generated. Tons of contrivances and the classic "and honestly?"

They were almost all posted on April 13th, and they all drop the "brand name" of the product in just really unnatural ways. This is 1,000% a bot.

Do go visit the profile though. The sex toy reviews are pretty funny.

If you want some quick examples of how this is obviously AI:

Ah I see. I've gotten into some back and forth with the maintainer of Caddy about these build inconsistencies and lack of versioning, and he responded by locking issues on me... twice.

After that exchange, I resigned myself to keeping things as they are, because upstream isn't willing to make the experience better for use cases like these.

He is impossible to work with, but Caddy is just really good :/

[–] ubergeek77@lemmy.ubergeek77.chat 1 point 3 weeks ago (2 children)

What issue did you have? If I can handle it for the user automatically, I can add a best effort attempt to avoid it.

[–] ubergeek77@lemmy.ubergeek77.chat 11 points 3 weeks ago* (last edited 3 weeks ago) (4 children)

Hi!

The project is still active! I just haven't needed to ship any updates because it's currently stable, and the upstream Docker Compose template, which my project uses as a reference, hasn't changed in 9 months:

https://github.com/LemmyNet/lemmy-ansible/blob/main/templates/docker-compose.yml

Whenever Lemmy releases a new Docker container, my script will just pull the latest one automatically. This will remain compatible until the compose deployment (linked above) introduces breaking changes that I need to incorporate.
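For anyone curious how that works under the hood: when a Compose file references an unpinned (or `latest`-tagged) image, re-running the deployment pulls whatever the newest published container is. A minimal sketch of the idea (the service and file names here are illustrative, not the exact ones my script generates):

```shell
# Pull the newest images referenced by docker-compose.yml --
# an unpinned tag like "latest" resolves to the most recent push --
# then recreate only the containers whose images actually changed.
docker compose pull
docker compose up -d
```

Containers whose images are unchanged are left running, so this is safe to re-run as often as you like.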

Thank you for the shout out :)

Ah I see, that makes sense. Thanks for explaining, I learned something from your comment and the other one.

[–] ubergeek77@lemmy.ubergeek77.chat 5 points 1 month ago* (last edited 1 month ago) (6 children)

I am not up to date on all these license debates, but don't you think equating Alpine Linux to "locked down DRM" is just a bit of a logical reach?

Alpine and its components are fully open source, you can make whatever changes you want to them. I am not seeing the argument here.

The people downvoting you have never experienced perfectly regular trees (and don't understand CrossCode inside jokes).

They explain it in the video. They already use algorithms to detect if things are buildings or not.

But if their algorithm can't make a determination or is uncertain below a certain threshold, they send it to Maptcha to get a bulk human opinion.

That's not the person I originally asked.

The person I asked actually did reply to me on this thread... but didn't answer how they know all this.

[–] ubergeek77@lemmy.ubergeek77.chat 11 points 6 months ago* (last edited 6 months ago) (4 children)

How do you know all of this?

I just want a reasonably priced generational bump over the Index. Most PCVR headsets with pancake lenses are either obscenely priced, ridiculously heavy, or have reportedly terrible QA. From what I've seen lately, usually all three are true.

You can get close in features and price with something like a Pico or a Quest, but they lack direct DisplayPort connection, so it's compressed wireless PCVR, compressed "wired" PCVR (which basically uses a networking protocol anyway), or no PCVR at all.

I, and I'm sure a ton of other people, am hoping the Deckard will be "huge" for the PCVR market, just like the Index was when it released. Maybe we're all coping (we probably are), but I think a lot of people are generally unhappy with the state of the PCVR hardware market right now.

So all this is to say... I really hope this thing is much better than a glorified flat screen projector.

[–] ubergeek77@lemmy.ubergeek77.chat 13 points 6 months ago (1 children)

If you don't want to open this and solder a chip, then no, you can't do what you want.

The closest you can get is to enable AutoRCM, which will cause the Switch to always boot into recovery mode and accept a payload. This skips the need to use a jig in the Joy-Con rail, but you still need to inject a payload. And because recovery mode is just a black screen, you don't have any visual feedback to know if the Switch is actually in recovery mode, or if the battery is just dead.
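For reference, "injecting a payload" from a Linux PC is typically done with a tool like the open-source fusee-launcher; roughly like this (the payload file name is just an example, use whatever bootloader payload you prefer):

```shell
# With the Switch in RCM and connected over USB,
# send a payload (e.g. the Hekate bootloader) to the device:
sudo python3 fusee-launcher.py hekate_ctcaer.bin
```

With AutoRCM enabled, this is the step you'd repeat on every cold boot, since the console always comes up in recovery mode waiting for a payload.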

Your best option is to just boot into whatever OS you use most, then make it a habit to keep it charged enough to not shut down.

 

In the past few days, I've seen a number of people having trouble getting Lemmy set up on their own servers. That motivated me to create Lemmy-Easy-Deploy, a dead-simple solution to deploying Lemmy using Docker Compose under the hood.

To accommodate people new to Docker or self-hosting, I've made it as simple as I possibly could. Edit the config file to specify your domain, then run the script. That's it! No manual configuration is needed. Your self-hosted Lemmy instance will be up and running in about a minute or less. Everything is taken care of for you. Random passwords are created for Lemmy's microservices, and HTTPS is handled automatically by Caddy.

Updates are easy too! Run the script again to detect and deploy new Lemmy versions automatically.

If you are an advanced user, plenty of config options are available. You can set this to compile Lemmy from source if you want, which is useful for trying out Release Candidate versions. You can also specify a Cloudflare API token, and if you do, HTTPS certificates will use the DNS challenge instead. This is helpful for Cloudflare proxy users, who can sometimes have issues with HTTPS certificates.

Try it out and let me know what you think!

https://github.com/ubergeek77/Lemmy-Easy-Deploy
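The whole flow looks roughly like the following (the config and script file names shown here are a sketch; check the repo's README for the exact ones):

```shell
# Grab the deployment tooling
git clone https://github.com/ubergeek77/Lemmy-Easy-Deploy.git
cd Lemmy-Easy-Deploy

# Set your domain in the config file (file name may differ)
nano config.env

# Run the deploy script; re-run it later to pick up Lemmy updates
./deploy.sh
```

Everything else (passwords, HTTPS via Caddy, the Compose stack itself) is generated for you on the first run.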

 


I noticed my feed on Lemmy was pretty dry today, even for Lemmy. Took me a while to realize lemmy.ml has been going up and down all morning, and isn't federating new posts.

But, since this is all still federated, I can still create and read posts on other instances while I wait. Even this one! Any other service would just be completely unavailable right now.

I do miss the larger communities on lemmy.ml - asklemmy, memes, and I really wanted to watch the reddit fallout on /c/reddit. Maybe I'll look around for some good replacements for those. Open to suggestions!
