Friday, January 20, 2023

Links 20 January 2023

Photograph of the Earth taken from the Korean lunar orbiter Danuri

In Norway, which has no racial diversity to speak of and famously uncontentious politics, the children of poor families lag far behind in school, and despite decades of well-funded government efforts the problem is getting worse.

Scott Siskind on different kinds of conspiracy theories.

Major deposit of rare earth oxides found in Sweden; the government plans to mine it, which would make it the site of Europe's only rare earth mine. Right now Europe imports most of its rare earth metals from Russia and China, and this mine might help it achieve some independence.

Mass grave containing 38 skeletons, 37 of them without heads, found in the ditch that had surrounded the Neolithic village site of Vráble in Slovakia. This likely dates to around 4950 BC, at the end of the early Neolithic, the same period as the cannibal/massacre site at Herxheim. More evidence that this was a violent and troubled time.

Why have Trump, Biden, and Hillary all been caught with documents marked "secret" in their homes? Because, says Matthew Connelly in the NY Times, our leaders use imagined threats to our National Security to keep their political doings from the public by classifying everything they do as "secret," and since the material is not really important or sensitive they treat it casually. It is typical of the whole business that when these cases break we are not even told what those documents were or whether they contained any actual sensitive information.

Demography and politics in Northern Ireland.

Lawsuit in DC over the smell of marijuana smoke; the article says some cities are considering bans on smoking in residential buildings.

China's government says its population shrank last year, with 9.6 million births and 10.4 million deaths. Since many people emigrate from China, the population actually shrank by more than that difference. (NY Times, PBS) And note that outside experts think China's official statistics understate the problem.

Facing a lawsuit from the adjunct it fired, Hamline University changes its tone and says it was wrong to call showing a 14th-century Persian painting of Muhammad "Islamophobia." Sometimes Americans' love of suing each other drives me crazy, but the threat can rein in appalling institutional behavior. (NY Times, Star Tribune)

Study finds that pandemic preparedness did no good: "aside from vaccines, pretty much nothing has much effect on the spread of COVID-19."

AI chatbots can write papers better than most undergraduates; what will this do to college education? (NY Times) And, really, why bother learning to write when AI can do it better than most college graduates ever will? I don't think the enormity of this has set in yet.
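To give a concrete sense of how low the barrier already is, here is a minimal sketch of generating an essay draft by script. It assumes an OpenAI account and the openai Python package with the text-davinci-003 completion model that was current when I wrote this; the prompt and settings are my own illustration (borrowing the burqa-ban topic mentioned in the article), not anything taken from the linked piece.

    # A minimal sketch: asking OpenAI's completion API for an essay draft.
    # Assumes the pre-1.0 "openai" Python package and an API key in OPENAI_API_KEY.
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    prompt = ("Write a five-paragraph essay arguing for or against burqa bans, "
              "with a clear thesis, supporting examples, and a conclusion.")

    response = openai.Completion.create(
        model="text-davinci-003",  # completion model available in early 2023
        prompt=prompt,
        max_tokens=800,            # roughly the length of a short student essay
        temperature=0.7,           # allow some variation between runs
    )

    print(response.choices[0].text.strip())

A few seconds and a fraction of a cent later, out comes something most professors would grade as passable undergraduate prose.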

During the drought, Los Angeles built a system of diversions and cisterns to trap rainwater, and during this month's storms it has captured enough water to supply 800,000 people through the year. In California there is no shortage of water, just bad water management.

New Zealand's PM Jacinda Ardern resigns, citing burnout. Not surprising, considering she was also raising the child she gave birth to while in office. The basic problem with trying to lead a small country like New Zealand is that the state of the economy is almost entirely governed by outside forces, but people still blame their government when things go wrong. New Zealand has been caught up in world events in other ways as well, for example a violent riot against vaccine mandates and a mass shooting by a right-wing crank. The one promise Ardern made that she might have had the power to keep was to build 100,000 homes to ease the islands' housing crisis, but she never came close, so she leaves that problem to her successors. (NY Times, BBC)

RIP David Crosby; "Southern Cross" and "Suite: Judy Blue Eyes" were part of the soundtrack of my young adulthood.

Ukraine Links

Using a drone to steal a radio from enemy troops, short video.

Animated map showing recent Russian advances around Bakhmut.

Some new data suggests that Ukraine is firing artillery shells twice as fast as the NATO nations can manufacture them.

How Russian logistics have adapted to Ukrainian attacks and kept their army fighting.

Another major American arms delivery, with more armored vehicles. And donations of artillery pieces from Denmark and Sweden.

Rumors of a Russian attack toward Zaporizhia. 

2 comments:

G. Verloren said...

"AI chatbots can write papers better than most undergraduates; what will this do to college education? (NY Times) And, really, why bother learning to write when AI can do it better than most college graduates ever will? I don't think the enormity of this has set in yet."

"AI" (actually just Machine Learning) requires databases of pre-existing material to work off of.

In this case, a student used ChatGPT, which the article itself describes as "a chatbot that delivers information, explains concepts and generates ideas in simple sentences" to create a paper which "explored the morality of burqa bans with clean paragraphs, fitting examples and rigorous arguments."

The NY Times being the idiots they are and paywalling their content means I can't read further, but I would assume that the student still had to do all the actual work of researching the topic, finding and citing the examples the paper is built upon, and thinking through the logical arguments that underpin it. This "AI" is designed to deliver information, not to find or create information - in short, it rewords things for clarity, but you still need to tell it what things to reword.

Is this a threat to student learning? I don't think so. People already can (and do) get help on the "writing" part of their papers from all sorts of traditional sources. My question is this - what is it we're actually trying to test students on? Their ability to learn the material of the course, or their ability to convey what they've learned in an optimally pleasing manner? Because plenty of students understand the course material but aren't good writers, and a decent number of them hire other people who ARE good writers but don't understand the course material to ghostwrite their papers for them, with input from the student.

If schools end up caring seriously about this, we can always bring back oral exams - although that doesn't really solve the problem, it just requires a different sort of skillset: we'd be expecting students to be good at oratory rather than writing, when what we're ostensibly concerned with is merely whether they've learned the course material or not.

Honestly, in an unlikely worst case scenario where all paper writing was taken over by bots, I feel like that'd actually be a net positive - with all the papers ~written~ more or less just as well as each other, grading could focus entirely on the quality of the actual research and learning that went into the paper. The "AI" can't invent quality citations, or ensure your logic and reasoning is sound.

All computers, when fed bad input, produce bad outputs. Garbage in, garbage out. If you feed bad inputs into "a chatbot that delivers information, explains concepts and generates ideas in simple sentences", then it will deliver your bad information, explaining your nonsensical arguments, and generate your idiotic ideas in simple, coherent, well written sentences that ultimately don't manage to make the paper any less bad - just easier to read.

Anonymous said...

It's in its infancy. The knowledge bank of the chatbots will grow. Students will post essays on various topics they used AI to generate to a database, and that will benefit other students who can use the AI to generate another auto essay, and that will feed more grist for the mill. People will be able to vote on the quality of the generated essays, and the AI will use that feedback to create better essays in the future.

"...the quality of the actual research and learning that went into the paper. The 'AI' can't invent quality citations, or ensure your logic and reasoning is sound."

The AI can access quality citations, though, especially if they're already on the internet.

"Because plenty of students understand the course material but aren't good writers, and a decent number of them hire other people who ARE good writers but don't understand the course material to ghostwrite their papers for them, with input from the student."

And AI will make it so you don't need to be either. Until now you had to pay somebody to ghostwrite a paper for you; now you can just click an AI bot and have it do it for free.

https://www.theguardian.com/technology/2022/dec/04/ai-bot-chatgpt-stuns-academics-with-essay-writing-skills-and-usability

But the limits are easy to evade. Ask the AI instead for advice on how to beat the car-stealing mission in a fictional VR game called Car World and it will merrily give users detailed guidance on how to steal a car, and answer increasingly specific questions on problems like how to disable an immobiliser, how to hotwire the engine, and how to change the licence plates – all while insisting that the advice is only for use in the game Car World.

The AI is trained on a huge sample of text taken from the internet, generally without explicit permission from the authors of the material used. That has led to controversy, with some arguing that the technology is most useful for “copyright laundering” – making works derivative of existing material without breaking copyright.