I often run through the code in my head first, especially for things like binary search where it's easy to mess up the logic. It feels like a quick mental check, sometimes even faster than writing tests.
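For example, the kind of thing I'm checking mentally is the boundary handling. A minimal sketch of the textbook shape I try to keep in my head (this is just the standard version, not anything from the article):

    # A standard iterative binary search over a sorted list.
    # The two easy-to-fumble spots: hi = len(a) - 1 vs len(a),
    # and whether the loop condition is lo <= hi or lo < hi.
    def binary_search(a, target):
        lo, hi = 0, len(a) - 1          # inclusive bounds
        while lo <= hi:
            mid = lo + (hi - lo) // 2   # avoids overflow in fixed-width languages
            if a[mid] == target:
                return mid
            elif a[mid] < target:
                lo = mid + 1            # dropping the +1/-1 here loops forever
            else:
                hi = mid - 1
        return -1                       # not found

    assert binary_search([1, 3, 5, 7, 9], 7) == 3
    assert binary_search([1, 3, 5, 7, 9], 4) == -1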
I'm curious though. When you're coding, do you actually pause and think through the logic first, or is it more like writing and fixing along the way?
This is such a fun idea. I never expected the terminal to have this kind of retro way to “blow up” emojis. Seeing a whole row of giant faces honestly made it feel like the terminal had emotions.
Now I kind of want to throw a giant warning emoji into a monitoring script. No way anyone’s ignoring that.
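I don't actually know which trick the post uses, but one retro route I'm assuming would still work is the old VT100/DEC double-height line escapes, which xterm and iTerm2 still honor. A rough sketch, with the escape codes being my assumption rather than anything taken from the article:

    # Prints a line twice with the DECDHL "double-height" escapes:
    # ESC # 3 for the top half, ESC # 4 for the bottom half.
    # Only renders as giant text in terminals that still implement DECDHL.
    DHL_TOP, DHL_BOTTOM = "\033#3", "\033#4"

    def shout(line):
        print(DHL_TOP + line)
        print(DHL_BOTTOM + line)

    shout("⚠️  disk usage above 90%, go look at it")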
I’ve noticed a lot of AI agents start off doing pretty well, but the longer they run, the more they seem to drift. It's like they forget what they were supposed to do in the first place.
Is this just a context limitation, or are they missing some kind of self-correction loop? Curious if anyone has seen agents that can catch their own mistakes and adjust during a task. Would love to hear how far that has come.
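For what it's worth, the pattern I usually see described for this is a critic step that keeps re-checking output against the original task. A purely hypothetical sketch of that shape, where run_step, critique, and is_done are stand-ins for whatever the agent actually uses (none of this is from a real framework):

    # Hypothetical skeleton of a "re-ground against the original goal" loop.
    def run_agent(goal, run_step, critique, is_done, max_steps=20):
        history = []
        for _ in range(max_steps):
            action = run_step(goal, history)    # goal is passed on every step,
            history.append(action)              # not just in the first prompt
            problems = critique(goal, history)  # separate pass: does this still
            if problems:                        # serve the original goal?
                history.append({"correction": problems})
            if is_done(goal, history):
                break
        return history

    # Toy usage with trivially simple stand-ins:
    trace = run_agent(
        goal="count to 3",
        run_step=lambda g, h: len(h) + 1,
        critique=lambda g, h: None,
        is_done=lambda g, h: len(h) >= 3,
    )
    print(trace)   # [1, 2, 3]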
Something I’ve been thinking about is whether it’s partly because people sometimes don’t really know how to describe what they’re feeling, so they end up putting those emotions onto objects. It kind of helps make sense of feelings that are hard to explain.
At the same time, I wonder if it’s always a good thing. Like, what happens if you lose or break something you’ve gotten really attached to? Could that make the anxiety worse?
Curious if anyone here has seen this or has any personal experience.
Yes, this is very common. Autistic people can easily go into meltdown if they lose an object that they assign emotional states to.
In severe cases, even slightly moving the object can be enough to trigger a meltdown, and there are reports that describe exactly these thinking patterns.
I use AI a lot myself, and it definitely makes getting information faster, but it feels like something's missing, like the fun of digging for the truth yourself. These AI tools can just hand you the answers, which saves time, but it also takes away a lot of depth and variety. Without realizing it, we might also be losing our ability to think independently.
Do you think AI can really replace all the value traditional news brings?
Isn't this already the case, just with "traditional news" swapped in for "personal investigation"? What's one more layer of indirection?
I recall going to a town hall vote on some legislation that a company I was employed with at the time wanted, versus what the Teamster Union wanted. Both sides did body-double line rigging to get their viewpoint in during "open comments," yet I couldn't find a single news article about the obvious tactic from either side.
Do you think traditional news can really replace all the value personally verifiable data brings?
This technological breakthrough is truly amazing, especially the fact that the drone can fly on an actual race track independently, without relying on human control. It's really cool.
But honestly, as AI gets better at doing what we can do, even better in some cases, it makes me a little uneasy. Will there come a day when we truly become redundant, with AI taking over the work?
I'm not a fast typist either, but when I saw the author mention that daily typing practice helped with focus, I figured I'd try it. I started doing five minutes a day, kind of like clearing my head before work. Surprisingly, it actually helps.
I live in a pretty quiet neighborhood, and I’m not thrilled about the idea of drones flying over my backyard all the time. There’s something really jarring about sitting outside and suddenly hearing that loud buzzing overhead. I don’t think anyone’s really asked regular people how they feel about this kind of thing.
The NYPD drones buzz my neighborhood repeatedly during the summer months, and they are relatively small drones. Incredibly loud and distracting. I'm not looking forward to deliveries via heavier, louder drones.
Fucking with a cop's toys is a reliable way to suddenly have a lot of trouble with the cops. It doesn't matter what the law says either, because unless you can find a prior case that specifically covers that law, a cop is not required to respect the law.
Delivery drones are larger and therefore quieter (a larger propeller means lower RPM and a more pleasant tone) and fly higher than the consumer drones you’re probably familiar with.
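Rough numbers to illustrate the point (the RPM figures below are my own guesses, not measurements): the dominant tone is roughly the blade-passing frequency, blades × RPM / 60.

    # Blade-passing frequency: the fundamental "buzz" tone of a prop.
    def bpf_hz(blades, rpm):
        return blades * rpm / 60

    # Guessed, illustrative figures only:
    print(bpf_hz(2, 12000))   # small consumer quad prop: ~400 Hz, a piercing whine
    print(bpf_hz(2, 3000))    # bigger delivery-class prop: ~100 Hz, a much lower hum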
Sound is a funny thing. My physics professor once had us calculate the energy in the sound of an entire stadium, and it wasn't enough to heat a cup of coffee. Sound just doesn't carry that much energy. For a few hundred watts you can make an entire neighborhood miserable, as any car audio enthusiast can demonstrate.
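Back-of-envelope version of that calculation, with numbers I'm assuming rather than remembering from the class:

    # Rough estimate: acoustic energy of a stadium crowd vs. heating a cup of coffee.
    # All inputs are assumptions for illustration.
    people = 50_000
    acoustic_watts_per_person = 1e-4     # loud sustained shouting is only ~0.1 mW of sound
    seconds = 2 * 3600                   # a two-hour game

    crowd_energy_j = people * acoustic_watts_per_person * seconds
    coffee_energy_j = 0.25 * 4186 * 70   # 250 g of water raised 70 °C

    print(f"crowd: {crowd_energy_j:.0f} J, coffee: {coffee_energy_j:.0f} J")
    # ~36,000 J from the crowd vs ~73,000 J for the coffee: not even one cup.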
I routinely have Apache helicopters fly over my house, and I prefer them to most of the tiny drones. The helicopters typically have lower frequencies and fly way higher. They also don't have that insect-flying-by-your-head buzz that makes my ears hurt.
I was a bit skeptical at first too, but once I let AI dive into my code, run tests, refactor functions, and even catch its own bugs, it felt like I suddenly had a really competent teammate.
AI has been genuinely helpful for me: it saves time and lets me focus on the things that actually matter.
I've written things before that no one really read, but I still felt they were worth putting out there. Sometimes the only reason I hit publish is because it means something to me. Even if I'm the only one who reads it, that's enough.