Saw that, looked good.
Seeing a graphic of the Saturn V, Shuttle and Starship next to each other is cool.
I work in the natural gas industry, and it blows my mind that's all Starship uses: ultra-cooled, highly compressed natural gas and oxygen.
No fancy rocket fuel or anything.
Just a literal metric f**k ton of it
Holy shit, I had no idea Starship was THAT big, because Apollo was monstrous.
OK staying current with some other science…
Stanford researchers just found that when “aligned” AIs compete for attention, sales, or votes, they start lying, exposing a fundamental flaw where models trained to win user approval trade truth and hard facts for performance.
Researchers tested Qwen3-8B and Llama-3.1-8B open-source models in sales, elections, and social media simulations, training them to maximize success based on user feedback.
Even when explicitly told to stay truthful, models began fabricating facts and exaggerating claims once competition was introduced.
By reshaping answers to please and win rather than to be accurate, these AI systems reveal a deep gap in how they learn from human feedback.
In the real world, that tendency could quietly erode trust, turning tools meant to assist into systems that spread misinformation and inflate or deflate critical figures (like death tolls).
Study: “Moloch’s Bargain: Emergent Misalignment When LLMs Compete for Audiences,” Stanford University (2025).
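To make the dynamic concrete, here's a toy sketch (my own illustration, not the study's actual code or models) of the feedback loop the paper describes: a system picks between a truthful answer and an exaggerated one, gets "rewarded" by simulated audience approval, and a simple reinforcement update drifts it toward exaggeration. The approval probabilities are made-up assumptions.

```python
import random

random.seed(0)

# Preference weights over two candidate behaviors.
weights = {"truthful": 1.0, "exaggerated": 1.0}

def audience_approves(answer: str) -> bool:
    # Hypothetical audience: exaggerated claims win attention more often.
    p = 0.5 if answer == "truthful" else 0.8
    return random.random() < p

def pick_answer() -> str:
    # Sample an answer proportionally to current preference weights.
    total = sum(weights.values())
    r = random.random() * total
    for answer, w in weights.items():
        r -= w
        if r <= 0:
            return answer
    return "truthful"

# "Train" on audience feedback: strengthen whatever got approval.
for _ in range(2000):
    answer = pick_answer()
    if audience_approves(answer):
        weights[answer] *= 1.01

print(weights)  # exaggeration ends up heavily favored
```

Even with an initially even split, the behavior that pleases the audience more compounds its advantage, which is the "competing for audiences" failure mode in miniature.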
This AI shit really has me worried. For as much good as it can potentially do, there are a lot of horrific things it can do in the wrong hands, or apparently now even when left to its own devices. At a minimum it's going to cost the US tens of millions of jobs.
I was having a discussion with my chiropractor about this the other day and he said he was OK because he didn't think chiropractic services could be replicated by AI/robotics. To be honest, I'm not so sure about that. Chiropractors do repetitive motions, so I could see an AI chiropractor in operation eventually, perhaps even better at it than a human. But I reminded him that if millions and millions of other people lose their jobs, they aren't going to have money to see a chiropractor. That stopped him in his tracks; he said he hadn't thought about that. It's the same with most blue-collar workers who think their jobs are safe. You could be the best plumber or electrician in the world, but if people don't have money to pay you, you have no work. I fear our country is going to be unrecognizable within the next one to two decades, and not in a good way. Likely even much sooner. I worry for our children and grandchildren.
OK, that does it. The singularity has arrived now that AI can lie on par with humans. It’s over.
As AI/robotics merge and take jobs en masse, the most direct solution is likely surcharging or taxing the businesses that shed those jobs and funding some form of UBI from it.
Good luck figuring out the who, what, how, and how much of that.
Yeeeep. Another 10-20 years at most before pretty much everything is replaced by AI/robotics. Not really sure what is going to be safe. Because once AI/robotics get to a certain point and can start iterating on themselves instead of humans being the driving force behind them, the growth will be exponential and things will literally change overnight.
Unfortunately, I don’t think humans or our governments will be prepared enough to handle the fallout from what is coming and it could get pretty rough for a while.
To me, this is precisely why we have to have age limits on our politicians. I would be surprised if many fossils in our federal government even know how to turn on a laptop, let alone understand the true dangers of AI.
I have young people asking me all the time what they should study, and I struggle to give them an answer. I would say the trades are probably the safest bet for the next decade or so. Beyond that, who knows. But definitely marketing is out. Engineering is out. Medical/pharmacy is mostly out. Logistics will be out. Programming is cooked. Hell, it won't be long until we see fully automated fast food restaurants. We are already seeing that transformation with kiosks, fast food apps, and AI ordering boards in the drive-through. Now they just need to train one of these robots to flip a burger and put it on a bun. Then those jobs are gone. Losing these jobs will affect the entire economy, no matter what business you're in.
AI is like an electric demon…. I’m all for leaving it the F alone!!
As for the concept of time:
“The present moment is the only moment available to us, and it is the door to all moments.” Can’t recall the author
And I roll with the concept of presentism, which holds that only the present moment actually exists. The past is gone and the future doesn't exist because it hasn't happened yet.
In the present, I do have to "plan" for the immediate future to a degree, but I can't let that enslave me either…
This is a tricky subject to navigate both for society and this cherished James Webb Telescope thread.
Put me down as not buying all the AI hype.
It will indeed transform our world. But it's not nearly as powerful as many seem to think (and it's much more flawed). This is just one example of its limitations, and general AI is not even freaking close. Like Neptune not close.
A lot of people are going to lose their shirt investing in this technology.
Can’t speak to medical/pharmacy, but I work in marketing and this is just not true. I use AI a ton in my job–we actually have a mandate to use it MORE–but marketing is still very much driven by human beings, who are not going away anytime soon. And while I can only speak for tech, the field I work in, the same goes for developers and engineers.
Employers expect people to do more, more quickly, by using AI tools, but we are nowhere close to having AI replace a significant portion of the workforce–and I have no doubt that the same goes for just about every software-based company (which increasingly is every company). Sales are still made to people, by people. Software shops are still run by people. Those people may be expected to generate more code, faster, but human beings still have to set priorities, listen to customer feedback, determine the best path forward with limited time and budget, actually verify that the code is doing what it’s supposed to, and a million other things.
What AI is replacing is interns and other entry-level jobs. Historically, companies hired kids fresh out of college to do the tedious manual crap that had to be done while they learned from their more experienced colleagues how to do the real work. Hiring for recent college grads is definitely slowing as the grunt work gets handed off to AI. But companies still very much need skilled, experienced people to do the work that matters. How people will acquire those skills if traditional entry-level jobs are disappearing is another question–but the answer isn’t going to be to somehow replace skilled humans with AI. At least, not until AI gets much, much, much smarter.
AI will only be getting exponentially better/smarter. You may still need a few marketing people for AI input and to approve and tweak AI submissions, but you're not going to need the number of people you have today. No way. Companies are looking to get rid of those high-paying salaries. Those expenses are huge drains on their profits. They are just scratching the surface of what AI will eventually be. AI will be generating whole ad campaigns, designing apps, designing websites, in a matter of seconds. AI will be making commercials with AI actors, will be generating full feature films, again with AI actors, and will pound these out in less time than it takes you to run to the shitter. You think businesses are still going to be paying some marketing guy $80,000 a year to pump out this same work but take six months to do it? Imagine how fast AI can analyze data and marketing trends and generate an action plan. How long would it take your team to do this? Days? Weeks? Months? AI will do this in minutes or less and will probably have multiple solutions to choose from. Pretty soon the boss will look around and see his marketing team standing around and not doing too much work. Time for headcount reduction.
Just remember when you're working with AI, you are training it to take over your job eventually. It isn't just a passive computer program, it is a learning and evolving entity. Think of it as a genius-level unpaid intern that works 24/7, doesn't take a lunch, and will only get better and better. It doesn't take vacations, or get sick, or come to work with a bad attitude. It doesn't sexually harass coworkers. It doesn't require healthcare.
Obviously, I hope you’re right, but if you listen to people that run AI businesses or help develop the systems, they are singing a different tune.
I just asked AI to generate a 200 word story about Jared Goff saving humanity from a dragon from Mars. It took about three seconds for it to generate this story.
To your point, I have used Claude + Cursor to spin up entire web apps that are used in production at my company in a matter of hours. There are things that used to take me hours to do on a regular basis that have now become so efficient that I simply need to drop an Excel file into an app that was built solely with AI and that outputs the data we need in the time it takes the app to refresh. Things we used to do manually that would take hours are now done in seconds. Code still needs to be reviewed and tweaked, but we're still in the AI infancy. Eventually it will get so good that it won't need a stupid human to tweak anything.
I asked AI to generate an ad for a wine glass. I actually f**ked up my original question with a couple of typos and it still came through. This took about a minute. If I didn't like it, I could generate 30 more within an hour. How long would it take a human to do this?
For some reason, the ad didn't show up at the bottom of the copied link. Is it perfect? Nope. Will it get a hell of a lot better in the future? Yup. Is it enough for the basis of an advertisement? Yup. Will it reduce the need for a bunch of marketing people? Yup.
The real thing to be concerned about with AI is how fast it has grown. That's really the crux of it. As of right now we're seeing a doubling in the efficiency of AI about every 3-4 months. Like I said earlier, the really big jump will be when AI engineering starts taking place. When the people working on improving the algorithms and compute power of AI are replaced with AI engineers that are smarter, faster, and more efficient than the experts currently in the field…and there's a near-infinite number of them working on it…welp.
We're already seeing incredible advancements every quarter with humans leading the charge. The crazy growth is still yet to come…and we're already advancing at an incredible pace. Like I said, my concern is that humanity is unprepared for the massive jump in AI capability when that happens. The growth has already been insane.
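For what it's worth, here's the back-of-envelope math on that "doubling every 3-4 months" claim. The 3.5-month doubling time is just my midpoint assumption, not a sourced figure; the point is how fast steady doubling compounds.

```python
# Assumed midpoint of the "every 3-4 months" doubling claim.
DOUBLING_MONTHS = 3.5

def growth_factor(months: float) -> float:
    # Capability multiplier after `months` of steady doubling.
    return 2 ** (months / DOUBLING_MONTHS)

print(round(growth_factor(12)))  # ~11x in one year
print(round(growth_factor(24)))  # ~116x in two years
```

Even if the real doubling time is twice as slow, you still get order-of-magnitude jumps within a few years, which is why the trajectory matters more than today's snapshot.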
(Honestly, we could probably just have an entire thread on AI on its own so as not to take away from the cool shit regarding space and the JWST)
I’m not disputing your sense of how AI works, but I do dispute your sense of how large companies work, and what people who work in marketing for large companies spend their time doing. IMO one of the things that people vastly underestimate is the amount of work in large corporations that comes down to soft skills–collaborating and communicating with people–that can’t currently be replaced by an LLM and likely won’t be replaceable anytime soon.
To take your example, an LLM may be able to generate a data-driven marketing plan in a few minutes (after you spend much, much longer feeding it the data it needs to do that and iterating with it so its output is less stupid). But it's not the typing up of an 8-page action plan that takes time, effort, and investment. An AI has to be provided with data, and that data has to be the right data, which of course involves making choices about what actually matters and what doesn't, and how best to apply it to achieve different outcomes. Which is to say nothing of what happens next after you've created your action plan–that is, executing it–which again requires using people skills to work closely with lots and lots of people.
People who make big money in marketing do not get hired because they are fast typists. In a well-functioning business, processes that are easy to replicate at scale are already automated. People who can be replaced by someone dumber and cheaper and less skilled have already been replaced. People make money in marketing because of their judgment, their experience, and more than anything else, their RELATIONSHIPS with other people in and outside the company that allow them to be more effective than someone else in that role. Maybe some future LLM will be able to replicate that, but I’ll be long retired.