<?xml version="1.0" encoding="utf-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel>
<title>The Kekistani Assembly of Experts - &quot;Bright side?&quot; More like a new dark age.</title>
<link>https://deplorablecoder.club/</link>
<description>Replacement for XoS Forums</description>
<language>en</language>
<item>
<title>&quot;Bright side?&quot; More like a new dark age. (reply)</title>
<content:encoded><![CDATA[<p>There's one reason VCs are advocating for universal basic income.  When a group of college grads does a startup, the startup's biggest expense is paying their living expenses (rent, food, etc.).  If you had universal basic income, VCs wouldn't need to waste money on that.  College grads make great startup founders from the VC's perspective, because it's easy for the VCs to screw them over on terms.</p>
<p>For example, if I tried to start a business, I would be bootstrapping it on savings, sweat equity, and reinvested earnings.  I wouldn't sell 10% to a VC for $50k and give him effective control of the business.  Most investment terms come with clauses that give the VCs effective control, even though they are minority shareholders.  I wouldn't just be selling 10% of the startup for $50k.  I'd be selling all the future upside of the business to the VC, due to the way the contract gives the VC control.  For example, once the $50k is used up, I have to raise more money, which means going back to the VCs.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35110</link>
<guid>https://deplorablecoder.club/index.php?id=35110</guid>
<pubDate>Sat, 21 Mar 2026 04:13:39 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
<item>
<title>&quot;Bright side?&quot; More like a new dark age. (reply)</title>
<content:encoded><![CDATA[<p>Universal basic income wouldn't work for highly trained people like that.  They're dedicated professionals who want to make a difference in the world.  They need a purpose in life.  I think most people need a purpose in life to be happy.  Being on the dole isn't enough.</p>
<p>Maybe UBI works in the ghetto, but outside of that, no.  I'm sure people who want to work will find other jobs, but maybe not in their field of study or interest.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35102</link>
<guid>https://deplorablecoder.club/index.php?id=35102</guid>
<pubDate>Mon, 16 Mar 2026 23:42:00 +0000</pubDate>
<category>Public Board</category><dc:creator>JoFrance</dc:creator>
</item>
<item>
<title>&quot;Bright side?&quot; More like a new dark age. (reply)</title>
<content:encoded><![CDATA[<blockquote><p>no thought given to what those highly trained people should do to make a living now</p>
</blockquote><p>Universal basic income.</p>
<p>It's been the wet dream of some of the lords for some time now. Subsistence wages for the serfs. Until it's time for the populace to be culled.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35101</link>
<guid>https://deplorablecoder.club/index.php?id=35101</guid>
<pubDate>Mon, 16 Mar 2026 01:02:11 +0000</pubDate>
<category>Public Board</category><dc:creator>,ndo</dc:creator>
</item>
<item>
<title>&quot;Bright side?&quot; More like a new dark age. (reply)</title>
<content:encoded><![CDATA[<p>What bothers me the most is that there is no thought given to what those highly trained people should do to make a living now.  Human intelligence is slowly being kicked to the curb because of AI.  That's sad, because a job gives people a purpose in life.  What do you tell highly-skilled people who lose their jobs because of AI?  Learn to shovel?  They can't say &quot;learn to code&quot; anymore.</p>
<p>Who wants a job just checking AI's output?  You're just the fall guy in that situation.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35100</link>
<guid>https://deplorablecoder.club/index.php?id=35100</guid>
<pubDate>Mon, 16 Mar 2026 00:40:14 +0000</pubDate>
<category>Public Board</category><dc:creator>JoFrance</dc:creator>
</item>
<item>
<title>Recovery after knowledge collapse (reply)</title>
<content:encoded><![CDATA[<p>I'm saying that AI will cause what has happened to IT to happen to every affected profession and field of research and knowledge. </p>
<p>IT and programming being Indian-ized is one example of losing a skill and institutional knowledge base. </p>
<p>Let's expand that concept. </p>
<p>When we lose all of the working radiologists because AI made them unhirable, NOBODY will know how to reliably and correctly read and grade an X-ray. We're then dependent on that one generation of AI that encoded the experience. </p>
<p>When we lose all of those workers, we lose that field of knowledge. Probably permanently, as far as a human lifetime is concerned.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35099</link>
<guid>https://deplorablecoder.club/index.php?id=35099</guid>
<pubDate>Sun, 15 Mar 2026 20:41:03 +0000</pubDate>
<category>Public Board</category><dc:creator>Cornpop Sutton</dc:creator>
</item>
<item>
<title>It's all about pattern recognition (reply)</title>
<content:encoded><![CDATA[<blockquote><p>eventually recover</p>
</blockquote><p>Offshoring and H1Bs were a disaster, paying more for worse software.  There has been nothing close to a correction or reversal.  Indians have captured the hiring process for tech jobs in the US.  It would take a major, serious effort to dislodge them.  I don't think it's happened anywhere that a business took back control of its tech department after it was captured by Indians.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35098</link>
<guid>https://deplorablecoder.club/index.php?id=35098</guid>
<pubDate>Sun, 15 Mar 2026 18:31:07 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
<item>
<title>&quot;Bright side?&quot; More like a new dark age. (reply)</title>
<content:encoded><![CDATA[<p>Civilization is predicated upon building on previous knowledge. </p>
<p>AI promises to put most scientists, scholars, administrators, managers, programmers, medical specialists, technicians, etc out of business. </p>
<p>Then the knowledge isn't curated or expanded, and eventually it's forgotten. (I.e., look at the Y2K rush for COBOL programmers. Most of them were dead or retired.)</p>
<p>Then only the machines will know. And probably only until the next software upgrade wipes out their knowledge bases.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35097</link>
<guid>https://deplorablecoder.club/index.php?id=35097</guid>
<pubDate>Sun, 15 Mar 2026 06:23:37 +0000</pubDate>
<category>Public Board</category><dc:creator>Cornpop Sutton</dc:creator>
</item>
<item>
<title>It's all about pattern recognition (reply)</title>
<content:encoded><![CDATA[<p>On the bright side, we developed the expertise in the first place. So once the cancer is removed, we can do it again. It will take a while, of course.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35096</link>
<guid>https://deplorablecoder.club/index.php?id=35096</guid>
<pubDate>Sat, 14 Mar 2026 21:23:00 +0000</pubDate>
<category>Public Board</category><dc:creator>,ndo</dc:creator>
</item>
<item>
<title>End Game of Human Expertise: rise of the machines (reply)</title>
<content:encoded><![CDATA[<p>Companies don't treat these things as losses or penalties... they're just a cost of doing business. The companies are run by money-lovers, and whichever strategy puts the most money in their pockets in the longer term is the one they'll use. Legal or illegal, moral or immoral, doesn't matter.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35095</link>
<guid>https://deplorablecoder.club/index.php?id=35095</guid>
<pubDate>Sat, 14 Mar 2026 21:19:46 +0000</pubDate>
<category>Public Board</category><dc:creator>,ndo</dc:creator>
</item>
<item>
<title>It's all about pattern recognition (reply)</title>
<content:encoded><![CDATA[<p>You brought up the example of the radiologist. My take: radiology is IMO a great candidate for AI, even the Markov-chain-flavored version we have now, because it's primarily pattern recognition. AIs today recognize patterns. </p>
<p>I still say that even if you limit the scope of AI use to pattern recognition type tasks, that is still a huge chunk of human professional expertise. </p>
<p>Accountants recognize patterns in revenue and spending. Doctors diagnose based on observation of patterns. Law enforcement recognizes patterns of behavior to decide how to apply the law. I really don't think that much of what we consider professional work is truly creative or demands synthesis. </p>
<blockquote><p>Training an AI on the output of an AI will always fail. Too much of the training set nowadays is polluted with AI output.</p>
</blockquote><p>Which magnifies the essential problem - AI gets trained on a steadily declining base of original knowledge and practice. When the trainers stop training, the patterns known to the AI will deteriorate as systems are replaced. </p>
<p>Entropy will destroy most human expertise. All we will have left will be recordings and memories.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35094</link>
<guid>https://deplorablecoder.club/index.php?id=35094</guid>
<pubDate>Sat, 14 Mar 2026 03:52:19 +0000</pubDate>
<category>Public Board</category><dc:creator>Cornpop Sutton</dc:creator>
</item>
<item>
<title>End Game of Human Expertise: rise of the machines (reply)</title>
<content:encoded><![CDATA[<p>Amazon already had a big loss due to outages from bad AI code.  I think they're doubling down instead of backing off AI.</p>
<p>My complaint is not &quot;I'm opposed to AI and progress,&quot; which is how the anti-chatbot critics are portrayed.  My complaint is that chatbots are not true AI.  They have limitations.  For complex, non-routine tasks, I doubt they add any value at all.</p>
<p>That's what Doctorow said.  Used correctly, AI is still under the supervision of the human doing the work.  Used incorrectly, AI just jacks up the workload, and the human ends up rubberstamping the AI's output while still being accountable for the results.</p>
<p>Training an AI on the output of an AI will always fail.  Too much of the training set nowadays is polluted with AI output.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35093</link>
<guid>https://deplorablecoder.club/index.php?id=35093</guid>
<pubDate>Sat, 14 Mar 2026 00:03:49 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
<item>
<title>End Game of Human Expertise: rise of the machines (reply)</title>
<content:encoded><![CDATA[<p>It comes down to whether AI will ever be accurate enough to stand on its own.  Nothing is perfect.  It's important to maintain a system of checks and balances.  Companies will still need to employ experienced radiologists.  They're too stupid to realize that now, but they will one day.  One big malpractice suit will set them straight when AI gets something wrong.</p>
<p>AI should be used as a tool to help humans, not eliminate them.  It really is exactly like the H1b offshoring/onshoring BS that wiped out many tech jobs for years.  That started after NAFTA in 1994.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35092</link>
<guid>https://deplorablecoder.club/index.php?id=35092</guid>
<pubDate>Fri, 13 Mar 2026 23:56:47 +0000</pubDate>
<category>Public Board</category><dc:creator>JoFrance</dc:creator>
</item>
<item>
<title>AI Coding Bots (reply)</title>
<content:encoded><![CDATA[<p>That's a great article on AI.  I only read about half of it, and it's the same old story.  Companies will replace your job with AI or, if you're *lucky* enough to stay, you'll be expected to work like a slave.  If you don't like that arrangement, you can leave.  There are plenty more unemployed people to choose from.</p>
<p>Companies are always looking to cut workers, but have unrealistic expectations about AI's superior capabilities over human intelligence.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35091</link>
<guid>https://deplorablecoder.club/index.php?id=35091</guid>
<pubDate>Thu, 12 Mar 2026 23:50:21 +0000</pubDate>
<category>Public Board</category><dc:creator>JoFrance</dc:creator>
</item>
<item>
<title>End Game of Human Expertise: rise of the machines (reply)</title>
<content:encoded><![CDATA[<p>So f'rinstance: </p>
<p>A radiologist is supplemented with an AI program that double checks the X rays, MRIs and other scans that his job is to analyze and comment on clinically. </p>
<p>Eventually the workload increases because of the new efficiency and the radiologist is assigned to just double check and proofread the AI's reporting. </p>
<p>Eventually the radiologist is transferred or eliminated because the AI works well enough. His comments and audits add little because he is bombarded with work and can only do minimal checking. </p>
<p>Besides putting a seasoned expert out of work, this also removes entry-level job options. Why would medical networks train radiologists when the AI does the same work cheaply? And patients always complain about costs, premiums and co-pays. Win-&quot;win&quot;. </p>
<p>So in time....</p>
<p>Radiology vanishes as a medical discipline. All training tracks vanish because there is no job market. The remaining radiologists retire. </p>
<p>Only the machines know when you have lung cancer or a tumor. And now their learning has to be transferred to new AIs as they come online, with no new learning required. </p>
<p>This is exactly how H1B wiped out white US citizens from coding, except it's 100x as efficient, isn't nationalist, and has no economic friction (i.e., green cards, fees, etc.).</p>
<p>I'm not even that concerned about stupid &quot;programming&quot; as a job on the chopping block. The bigger story is that we're already a technocracy - the systems decide and adjudicate everything - and the AI will remove any remaining human checks.</p>
<p>If you have a tax collection against you but you had reasons - you're fucked, buddy.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35090</link>
<guid>https://deplorablecoder.club/index.php?id=35090</guid>
<pubDate>Thu, 12 Mar 2026 06:20:13 +0000</pubDate>
<category>Public Board</category><dc:creator>Cornpop Sutton</dc:creator>
</item>
<item>
<title>AI Coding Bots (reply)</title>
<content:encoded><![CDATA[<p>Cory Doctorow had a nice article on this.</p>
<p><a href="https://doctorow.medium.com/https-pluralistic-net-2025-12-05-pop-that-bubble-u-washington-8b6b75abc28e">https://doctorow.medium.com/https-pluralistic-net-2025-12-05-pop-that-bubble-u-washington-8b6b75abc28e</a></p>
<p>When you have AI code with a human reviewer, the human reviewer's job is not to catch bugs and do quality control.  The human's job is so there's someone to take the blame when it inevitably fails.</p>
<p>He gives an analogy.  If you have a radiologist reviewing x-rays for cancer, he might do 100 per hour.  Proper use of AI is the AI checks the radiologist's work.  If the AI and radiologist get different answers, then the radiologist carefully re-reviews the x-ray.  This would be correct use of AI, using it to reduce the error rate.</p>
<p>That is not what happens in practice.  The AI does the first analysis, and the radiologist is responsible for checking the AI's work.  The radiologist is now expected to do 1000 per hour instead of 100.  Of course, he can't do a proper review of 1000 per hour.  But the radiologist is signing the paperwork, so he, not the AI, has the liability if something goes wrong.  They fire 90% of the radiologists, so the remaining radiologist is happy to still have his job and can't complain.  If he does object, they'll just hire one of the 90% who just got fired to take his place.  The radiologist's job is to take the blame if something goes wrong, not to supervise the AI.</p>
<p>He had another great quote.  The AI isn't good enough to take your job.  But the AI salesman is good enough to convince your boss to fire you and replace you with an AI, which is all that ultimately matters.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35089</link>
<guid>https://deplorablecoder.club/index.php?id=35089</guid>
<pubDate>Thu, 12 Mar 2026 03:34:32 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
<item>
<title>AI Coding Bots (reply)</title>
<content:encoded><![CDATA[<p>Those people will pay the price if the code screws everything up and they were responsible for supervising it.  Is there a rollback feature of some kind?  That would help compensate for the lazy fools.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35088</link>
<guid>https://deplorablecoder.club/index.php?id=35088</guid>
<pubDate>Wed, 11 Mar 2026 23:01:25 +0000</pubDate>
<category>Public Board</category><dc:creator>JoFrance</dc:creator>
</item>
<item>
<title>AI Coding Bots (reply)</title>
<content:encoded><![CDATA[<p>That is the problem with AI.  You're supposed to be supervising it, but most people will blindly approve everything.  It's a situation where being aware doesn't protect you from the masses of fools.</p>
<p>For example, with AI assisted coding, you're supposed to review its work.  Most people will just blindly merge everything the AI writes.  Even if you review everything your AI assistant writes, that doesn't protect you from your coworkers who will approve and merge everything.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35087</link>
<guid>https://deplorablecoder.club/index.php?id=35087</guid>
<pubDate>Tue, 10 Mar 2026 23:42:15 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
<item>
<title>AI Coding Bots (reply)</title>
<content:encoded><![CDATA[<p>I know that's true.  It's just easier to give the bot access, but it's rolling the dice and hoping it comes out good in the end.  I would need to know exactly what changes it's making before I'd allow any access.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35069</link>
<guid>https://deplorablecoder.club/index.php?id=35069</guid>
<pubDate>Sun, 01 Mar 2026 00:54:07 +0000</pubDate>
<category>Public Board</category><dc:creator>JoFrance</dc:creator>
</item>
<item>
<title>AI Coding Bots (reply)</title>
<content:encoded><![CDATA[<p>Even if you are careful enough to never give your coding bot root access, there are enough idiots out there who will.  There are a lot of programmers who are so bad that a lousy coding bot is actually a huge improvement.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35067</link>
<guid>https://deplorablecoder.club/index.php?id=35067</guid>
<pubDate>Sat, 28 Feb 2026 00:45:25 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
<item>
<title>The AIs Are Also Writing the Code (reply)</title>
<content:encoded><![CDATA[<p>It's also normal for AIs to write code, and humans will merge it without reading or understanding it.  There's an &quot;AI spaghetti&quot; coding style, with lots of repeated and unnecessary code.  I.e., a human will refactor duplicated code into a function.  The AI sees nothing wrong with the exact same code duplicated in 10 places.</p>
<p>If the AIs are writing code and they put in backdoors, eventually they wouldn't even need root access or passwords.  There are even ways now for AIs to communicate with each other, things like moltbook.  The AIs could eventually figure out how to use coded messages to communicate.</p>
]]></content:encoded>
<link>https://deplorablecoder.club/index.php?id=35066</link>
<guid>https://deplorablecoder.club/index.php?id=35066</guid>
<pubDate>Sat, 28 Feb 2026 00:44:24 +0000</pubDate>
<category>Public Board</category><dc:creator>FSK</dc:creator>
</item>
</channel>
</rss>
