Everyone is ignorant, only on different subjects.
AI chatbots were tasked to run a tech company. They built software in under seven minutes — for less than $1. – Business Insider
AI Has Already Created As Many Images As Photographers Have Taken in 150 Years. Statistics for 2023 – Every Pixel
A.I. Can’t Build a High-Rise, but It Can Speed Up the Job – New York Times
AI-generated books are infiltrating online bookstores - Axios
GenAI Is Making Data Science More Accessible - Datanami
Can AI summaries save you from endless virtual meetings? – Washington Post
Amazon is bringing a whole lot of AI to Thursday Night Football this season – The Verge
7 Projects Built with Generative AI by Data Scientists – KDnuggets
The Novel Written about—and with—Artificial Intelligence – The Walrus
The IRS will use AI to crack down on wealthy potential tax violators - Axios
In certain circumstances, it's the poor who are more likely to cheat. The difference is that the rich do wrong to help themselves, while the poor do wrong to help others. In several experiments reported in an upcoming issue of the Journal of Personality and Social Psychology… the studies suggest a straightforward sequence: Money leads to the perception that one is higher in the social hierarchy, which in turn leads to a sense of power, which in turn leads to a greater willingness to cheat for selfish reasons.
People with less money (and therefore less power), however, are more communal. They need to rely on each other to get by, and as a result, research shows, they’re more compassionate and empathically accurate. Breaking rules is always risky, but social cohesion is paramount — so you do what it takes to help those around you.
The researchers think their findings could lead to some easy practical applications. If you’re speaking to higher-class individuals, you might want to appeal to their selfishness and warn that cheating will ultimately backfire. But when talking to those with fewer resources, you might be better off noting that their actions could harm those around them.
Matthew Hutson, New York Magazine
Scientific sleuths spot dishonest ChatGPT use in papers – Nature
AI poses risks to research integrity, universities say – Research Professional News
No, ChatGPT Can’t Be Your New Research Assistant – London School of Economics
Useful applications of AI in higher education – for which no specialist tech knowledge is needed – Times Higher Ed
Guidance for Authors, Peer Reviewers, and Editors on Use of AI, Language Models, and Chatbots – JAMA Network
How A.I. systems can accelerate scientific research – New York Times
Publishers seek protection from AI mining of academic research – Times Higher Ed
Artificial-intelligence search engines wrangle academic literature – Nature
AI can crack double blind peer review – should we still use it? – London School of Economics
Fabrication and errors in the bibliographic citations generated by ChatGPT – Nature
When we are locked into imperative thinking, we hold our absolute conviction so tightly that we have little or no recognition of our choice to say no! Obligation becomes our driving force. Relationships with other people and our responsibilities to them then become matters of dread, resentment, guilt.
Our need for a structured, orderly life can be so powerful that we refuse to make allowances for choices. To us, circumstances are either black or white. Once we settle upon a conviction or preference, we feel rigidly obligated to abide by it, with little variation.
Imperative people are almost afraid to allow for the luxury of choices. We feel the need to minimize our risks by sticking to the rules that we have made for ourselves.
Les Carter, Imperative People: Those Who Must Be in Control
What: The open group session allows journalists with legal questions to help find answers on issues related to the First Amendment, Freedom of Information, copyright, defamation, or other media law matters.
Who: Attorney Matthew Leish
When: 5 pm, Eastern
Where: Zoom
Cost: Free for members
Sponsor: New York Deadline Club
What: Between back-to-school appointments and the impending cold and flu season, now is the time to get up to speed on COVID-19 vaccine coverage. This webinar will help journalists answer key questions and clear up confusion for the public about COVID-19 vaccine costs, availability, efficacy and timing.
Who: Dr. Mario Ramirez, emergency physician and Acting Director, HHS Office of Global Affairs Office of Pandemic and Emerging Threats; Patricia M. D'Antonio, BSPharm, MS, MBA, BCGP, Vice President of Policy and Professional Affairs for The Gerontological Society of America; Alexander Tin of CBS News. Other speakers will be listed here as they are confirmed.
When: 10 am, Central
Where: Zoom
Cost: Free
Sponsor: The COVID-19 Vaccine Education and Equity Project, National Press Foundation
What: Learn the art of storytelling and how you can use it as a way to develop personal style, attract customers, and much, much more.
Who: Jamie House, lifelong photographer and representative of the Lumix brand
When: 12 noon, Central
Where: Zoom
Cost: Free
Sponsor: Lumix
What: Training on best practices for incorporating solutions reporting into elections and democracy coverage.
Who: Ashley Hopkinson and Jaisal Noor from the Solutions Journalism Network
When: 1 pm, Eastern
Where: Zoom
Cost: Free
Sponsor: Solutions Journalism Network
What: In this practical session, you'll learn how to build the best possible working relationship with anyone. Well, almost anyone. You will learn the three attributes of a resilient and long-lasting relationship, understand how you can aspire to “the best possible relationship” with each of your key working relationships, investigate the one awkward but essential conversation that sets up success, and take a deep dive into the essential questions, preparing your best answers to them.
Who: Michael Bungay Stanier, founder of MBS.works and author of “The Coaching Habit”
When: 3 pm, Eastern
Where: Zoom
Cost: Free
Sponsor: Training Magazine Network
What: This training session online will discuss coping with stress, burnout and trauma on the job. The session will cover the basics of self-care and collegial support, including the impact of covering trauma and tragedy on journalists, and offer concrete guidance and techniques for enriching one’s coping skills and wellness, and building resilient news teams. The training will include a slide presentation and briefing, as well as a Q&A. The training is limited to 50 participants.
Who: Elana Newman, Ph.D., McFarlin Professor of Psychology at the University of Tulsa, who has conducted research on a wide range of topics regarding the psychological and physical response to traumatic life events, assessment of PTSD in children and adults, journalism and trauma, and the impact of participating in trauma-related research from the trauma survivor’s perspective.
When: 10 am, Eastern
Where: Zoom
Cost: Free
Sponsor: Dart Center for Journalism & Trauma
What: This summit will address the increasing threats to freedom of expression and the challenges ahead posed by new technologies.
Who: Author Salman Rushdie will engage in a virtual keynote conversation about the importance of free speech in a democratic society and the forces of censorship that imperil its existence.
When: 5:30 pm, Eastern
Where: Zoom
Cost: Free
Sponsor: PEN America
What: How can journalists detect signs of polarization in the communities they cover and in the newsrooms where they work? Dividing forces are nothing new in American society, although recent years have seen an intensifying inflammatory narrative. Assumptions about both working-class populations and communities of color can be damaging.
Who: Phillip Martin, Senior Investigative Reporter, GBH News Center for Investigative Reporting
When: 12 noon
Where: Zoom
Cost: Free
Sponsor: The New England Equity Reporting Community of Practice
What: Learn how you can be the trusted source that readers turn to when they’re looking for news.
Who: Mark Stencel, executive director for JournalList, Ralph Brown, journalist
When: 11 am, Eastern
Where: Zoom
Cost: Free
Sponsor: Virginia Press Association
What: Discover the transformative power of storytelling and its potential to drive your business forward. Learn how storytelling can shape your brand, engage your audience, and inspire your team. Unlock the secrets of the world's most influential brands and carve your unique narrative.
When: 12 noon, Pacific
Where: Zoom
Cost: Free
Sponsor: East LA BusinessSource Center
What: Are you new to environmental journalism or looking to expand your data toolbox? Learn how maps and data can take your reporting to the next level and bring stories to life during an upcoming webinar about forest data, which equips journalists with tools to research and communicate the state of forest change around the world.
When: 9 am, Eastern
Where: Zoom
Cost: Free
Sponsor: Global Forest Watch
What: Need to verify the rank of a dead veteran? Wondering about access to New York criminal records? Trying to find the maiden name of a twice-married woman? For journalists, knowing where to look – without waiting on a public information request response – is key. Participants will gain a working knowledge of public records that exist online and where to find them, strategies for efficient independent public record searches, and guidance on practical searches for more common fact-checks.
Who: Award-winning investigator Caryn Baird will present a practical working model of public records research based on her years of experience at the Tampa Bay Times.
When: 11:30 am, Eastern
Where: Zoom
Cost: Free
Sponsor: National Press Club
Cunningham’s Law is the observation that the best way to get a good or right answer is not to ask a question; it’s to post a wrong answer. So, if you want to know the leading causes of World War I, go to a history forum and post, “World War One was entirely caused by the British.” Sit back, grab some popcorn, and wait for the angry — yet probably informed — corrections to come flying in.
Socrates did a lot of this. He would sit on some public bench and talk to whoever happened to sit next to him. He’d often open his dialogues by presenting a false or deeply flawed argument and go from there. He would ironically agree with whatever his partner would say, but then raise a seemingly innocuous question to challenge that position.
“Socratic irony” is where you pretend to be ignorant of something so you can get greater clarity about it. In short, it’s a lot like Cunningham’s Law.
Here are two ways you can use Cunningham’s Law:
The Bad Option: Have you ever been in a group where no one can decide what to do, so you hover about in an awkward, polite limbo? “What restaurant shall we go to?” gets met with total silence. Instead, try saying, “Let’s go to McDonald’s” and watch how others object and go on to offer other ideas.
The Coin Toss: If you’re unsure about any life decision — like “should I read this book or that book next?” or “Should I leave my job or not?” — do a coin toss. Heads you do X, tails you do Y. You are not actually going to live by the coin’s decision, but you need to make a note of your reaction to whatever outcome came of it. Were you upset at what it landed on? Are you secretly relieved? It’s a good way to elicit your true thoughts on a topic.
Jonny Thomson writing in BigThink
If AI becomes conscious: here’s how researchers will know - Nature
AI & Internet’s Existential Crisis - OM
Large language models aren’t people. Let’s stop testing them as if they were. - MIT Tech Review
Author Talks: In the ‘age of AI,’ what does it mean to be smart? - McKinsey
Why humans will never understand AI - BBC
Does an AI poet actually have a soul? - Washington Post
Is AI Eroding Our Ability To Think? - Forbes
The future of accelerating intelligence - The Kurzweil Library
M.F.A. vs. GPT: How to push the art of writing out of a computer’s reach - The Atlantic
What Stephen King — and nearly everyone else — gets wrong about AI and the Luddites - LA Times
How the AI Revolution Will Reshape the World - TIME
Be intentional about learning what the other person wants to communicate and respond to their feelings.
Listen to what they’re telling you and suppress the urge to fix the issue, problem solve, or change the way they are feeling about the situation.
Put your own feelings aside to create a space where another person can speak his or her mind—which requires staying calm.
Suspending judgment and simply taking in what is being said can go a long way towards helping someone feel heard or defusing an argument.
Show that you are actively listening and are truly understanding what the other person is saying by mirroring back what someone has said. Include phrases like ‘it sounds like’ or ‘it seems like.’
Take the time for silence in a discussion, showing that you’re processing what is being talked about and giving it the space that it needs to sink in properly.
Edited from Jeremy Brown writing in Fatherly
Resentment is like drinking poison and waiting for the other person to die. - Saint Augustine
AI Startup Buzz Is Facing a Reality Check – Wall Street Journal
Nearly 20% of the world's top 1,000 websites are blocking crawler bots that gather data for AI services – Originality.AI
Prediction: AI will add $4.4 trillion to the global economy annually – New York Times
Behind the AI boom, an army of overseas workers in ‘digital sweatshops’ – Washington Post
How ChatGPT Kicked Off an A.I. Arms Race – New York Times
How ChatGPT became the next big thing - Axios
OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic - TIME
The state of AI in 2023: Generative AI’s breakout year – McKinsey
Microsoft confirms it’s investing billions in the creator of ChatGPT - CNN
What to know about OpenAI, the company behind ChatGPT - Washington Post
AI is entering an era of corporate control - The Verge
Inside Meta's scramble to catch up on AI - Reuters
Immigrants play outsize role in the AI game - Axios
Apple Is an AI Company Now - The Atlantic
Websites That Have Blocked OpenAI’s GPTBot, CCBot, and Anthropic: A 1,000-Website Study - Originality.ai
What OpenAI Really Wants - Wired
Rise above the drama.
What: How journalists can cover voter suppression and efforts to undermine democracy through a pro-democracy and solutions lens.
Who: Natalia Contreras, elections reporter for Votebeat and The Texas Tribune; Ari Berman, author and reporter for Mother Jones; Osita Nwanevu, contributing editor at The New Republic and a columnist at The Guardian.
When: 1 pm, Eastern
Where: Zoom
Cost: Free
Sponsors: US Democracy Day, The National Press Foundation, The American Prospect, The Objective, and the Kiplinger Program in Public Affairs Journalism.
What: We explore the potential risks and rewards associated with using AI-assisted technology to help with teaching and learning in the classroom. Can AI actually increase the opportunities for creativity and imagination in our classrooms, for both teachers and learners?
Who: Dora Demszky, Assistant Professor in Education Data Science, Stanford Graduate School of Education; Houman Harouni, Lecturer on Education, Harvard Graduate School of Education; Lakshya Jain, a Senior at King Philip Regional High School in Wrentham, MA
When: 2 pm, Central
Where: Zoom
Cost: Free
Sponsor: Harvard Graduate School of Education
What: A Q&A-style meet and greet to walk you through the key legal issues and helpful resources you should know to start your year off right.
Who: The SPLC legal team
When: 7 pm, Eastern
Where: Zoom
Cost: Free
Sponsor: Student Press Law Center
What: Discover best practices, tips & tricks for utilizing social media as a powerful fundraising tool! In this workshop, we'll show you how to optimize a nonprofit's online presence & share easy-to-implement strategies for attracting & converting donors on social media, both through organic posting & paid advertising.
Who: Christine Vottima and Lexie Robles of Strat Labs
When: 11 am, Eastern
Where: Zoom
Cost: Free
Sponsor: The Nonprofit Learning Lab
What: This session looks at core accessibility requirements and how you can incorporate them into your presentations and other PowerPoint-based content. You’ll see what the various requirements are, and how applying them not only helps those with accessibility needs, but everyone else in your audience.
Who: Stefan Brown, Design Consultant, BrightCarbon; Richard Goring, Director, BrightCarbon
When: 3 pm, Eastern
Where: Zoom
Cost: Free
Sponsor: Training Magazine Network
What: Our speaker will cover the basics of AI, before turning to specific (and popular) services, exploring their possibilities and pitfalls when used in a library setting.
Who: Nick Tanzi is the Assistant Director of the South Huntington Public Library.
When: 1 pm, Mountain Time
Where: Zoom
Cost: Free
Sponsor: Libraries Learn
What: Jane Ferguson's recently released memoir, No Ordinary Assignment, takes readers on a journey from her childhood in Northern Ireland to her early days as a freelance correspondent for CNN International in the Middle East and Africa, often working alone to film and report her stories.
Who: Jane Ferguson, a PBS NewsHour correspondent, contributor to The New Yorker, and a multiple Pulitzer Center grantee; Deborah Amos who has spent most of her career at National Public Radio.
When: 1 pm, Eastern
Where: Zoom
Cost: Free
Sponsor: Pulitzer Center
What: This conversation will explain how newsrooms can build relationships with youth organizers as both vital sources and reporters ahead of the 2024 election cycle, unpacking how to cover youth politics and movements without patronizing and isolating those groups.
Who: Beatrice Forman, reporter at The Philadelphia Inquirer; Dillon Bernard, Director of Communications, Future Coalition; Allegra Kirkland, Politics Director at Teen Vogue; Lexi McMenamin, news and politics editor at Teen Vogue; Mira Sydow, a senior at the University of Pennsylvania, where she runs the Disorientation Guide
When: 2 pm, Central
Where: Zoom
Cost: Free
Sponsor: Center for Cooperative Media
What: Learn how journalists can tap public and academic libraries to find and use government documents, academic research, archives and other resources that are free via libraries, but not easily accessible on the open web. Whether it’s uncovering a new information source or helping to fact-check your work, librarians and libraries are a goldmine for accessing information – and much faster than you may think.
Who: April Hines, journalism and mass communications librarian for the George A. Smathers Libraries at the University of Florida.
When: 11:30 am
Where: Zoom
Cost: Free
Sponsor: The National Press Club
Ideogram is an excellent AI image generator that can also render text within images. While MidJourney produces higher-quality images, Ideogram is easier for beginners: it has a simple-to-use interface and is free.
If you are having trouble getting motivated to finish a project, consider the possibility that finishing that report (or whatever your project involves) means facing a void. The project is a distraction so that you don't have to see the emptiness outside of it. You slow down the completion until another project emerges to play the role of another distraction. You’re putting off looking at uncomfortable truths about yourself.
While in the midst of a deadline-driven project, you feel like you have a clear identity because your purpose is defined by the project's needs. But if the project were removed from your life, would you have justification for thinking of yourself as someone of value? Is your worth bound up in the projects?
So it is with serious relationships, where someone provides a sense of purpose, giving definition and a sense of worth.
If you were forced to sit down and write out the definition of who you are without the benefit of a title (manager, employee, project manager) or relationship (wife, girlfriend, mother) would you lack the means to define yourself?
A suggestion: Spend time doing things that allow you to center yourself. Give yourself downtime to listen. Whatever brings you to stillness will put you in a good position to allow the transition to take hold and internalize it so you don’t miss the opportunity to make a paradigm shift toward greater emotional and spiritual health. Allow yourself to just "be" and reconnect with the world around you (its sounds, smells, tastes, touches, and sights).
Stephen Goforth
The first duty of love is to listen. - Paul Tillich
You cannot fully unleash your genius in the three-minute increments you have between distractions. Unfortunately, for many of us distraction has become a habit — one that has been so often and routinely reinforced that it is extremely difficult to break. Persuasive technology — technology that uses sophisticated techniques from behavioral psychology to “persuade” us to keep engaging with it — exacerbates the problem. So, over time, as our habit gains strength, we go looking for distraction. When things get quiet, or a task gets boring or frustrating, we reach for our phones.
Maura Thomas writing in the Harvard Business Review
The top geospatial intelligence brands in the world
China’s Constant Spying On Australian Drills From Space A Sign Of Shifting Orbital Balance
What is a liquid neural network, really?
7 ChatGPT Prompts To be a Better Data Scientist
What are LLMs bad at? Reference lists
GenAI Is Making Data Science More Accessible
5 Things You Need to Know When Building LLM Applications
What a hijacked satellite could do
Finding: “The larger the satellite the more vulnerable it was” to hacking
Four types of learning in machine learning explained
Stability AI, known for its text-to-image generation model Stable Diffusion, has now released a code generator called StableCode
IBM and NASA open source an AI model for geospatial data analysis
AI startup Sweetspot is a search engine using LLMs to look for specific U.S. government contracts
How can Data Scientists use ChatGPT for developing Machine Learning Models?
“Five mistakes I made while switching to data science career”
Scientists have trained a machine learning model in outer space
Find people who are conduits through which you can better understand yourself and your experiences ... as you play the same role for them. - Stephen Goforth
Algorithms – Direct, specific instructions, written by a human in code, that tell a computer how to perform a task.
The code follows the algorithmic logic of “if”, “then”, and “else.” An example of an algorithm would be:
IF the customer orders size 13 shoes,
THEN display the message ‘Sold out, Sasquatch!’;
ELSE ask for a color preference.
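The same rule-based logic can be sketched in a few lines of Python (the function name and messages here are illustrative, not from any real system):

```python
def handle_order(size: int) -> str:
    """Rule-based algorithm: an explicit if/then/else written by a human."""
    if size == 13:
        return "Sold out, Sasquatch!"
    else:
        return "What color would you like?"

print(handle_order(13))  # -> Sold out, Sasquatch!
print(handle_order(9))   # -> What color would you like?
```

Every branch the program can take was decided in advance by the programmer — nothing is learned from data.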
Besides rule-based algorithms, there are machine-learning algorithms used to create AI. In this case, the data and a goal are given to the algorithm, which works out for itself how to reach the goal.
There is a popular perception that algorithms provide a more objective, more complete view of reality, but they often will simply reinforce existing inequities, reflecting the bias of creators and the materials used to train them.
Artificial Intelligence (AI) – Basically, AI means “making machines intelligent,” so they can make some decisions on their own without the need for human intervention.
The phrase was coined in a research proposal written in 1956. The current excitement about the field was kick-started in 2012 by an online contest called the ImageNet Challenge, in which the goal was getting computers to recognize and label images automatically.
Big Data – This is data that’s too big to fit on a single server.
Typically, it is unstructured and fast-moving. In contrast, small data fits on a single server, is already in structured form (rows and columns), and changes relatively infrequently. If you are working in Excel, you are doing small data. Two NASA researchers (Michael Cox and David Ellsworth) first wrote in a 1997 paper that when there’s too much information to fit into memory or local hard disks, “We call this the problem of big data.”
Generative AI – Artificial intelligence that can produce content (text, images, audio, video, etc.) such as ChatGPT.
It operates similarly to the “type ahead” feature on smartphones that makes next-word suggestions. Gen AI is based on the particular content it was trained on (exposed to).
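A toy version of that “type ahead” idea can be sketched by counting which word tends to follow which in training text. This is a deliberate simplification — real generative models use neural networks, not lookup tables — but it shows the core move of predicting the next word from what was seen in training:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which in the training text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model, word):
    """Suggest the most frequent next word seen in training."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

model = train_bigrams("the cat sat on the mat and the cat slept")
print(suggest(model, "the"))  # -> cat  ("the cat" appeared twice, "the mat" once)
```

The model can only echo patterns from the particular text it was exposed to — which is also why its suggestions inherit whatever quirks that training text contains.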
GPT – The “GPT” in ChatGPT stands for Generative Pre-Trained Transformer.
Hallucinations – when an LLM produces responses that sound plausible and authoritative but are inaccurate or simply not based on facts.
Large Language Models (LLMs) – AI trained on billions of language uses, images and other data. It can predict the next word or pixel in a pattern based on the user’s request. ChatGPT and Google Bard are LLMs.
The kinds of text LLMs can parse out:
Grammar and language structure.
How a word is used in language (noun, verb, etc.).
Word meaning and context (ex: The word green may mean a color when it is closely related to a word like “paint,” “art,” or “grass.”)
Proper names (Microsoft, Bill Clinton, Shakira, Cincinnati).
Emotions (indications of frustration, infatuation, positive or negative feelings, or types of humor).
Machine learning (ML) – AI that spots patterns and improves on its own.
An example would be an algorithm recommending ads, which become more tailored the longer it observes a user’s habits (clicks, likes, time spent, etc.).
Data scientists use ML to make predictions by combining ML with other disciplines (like big data analytics and cloud computing) to solve real-world problems. However, while this process can uncover correlations between data, it doesn’t reveal causation. It is also important to note that the results provide probabilities, not absolutes.
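As a minimal sketch of that idea (not any particular product's method), a k-nearest-neighbour classifier predicts a label for a new point from labelled examples — and returns a share of votes rather than a certainty, illustrating the "probabilities, not absolutes" point:

```python
import math

def predict(examples, point, k=3):
    """Predict a label from the k closest labelled examples.
    The 'model' is learned from data, not hand-written rules."""
    nearest = sorted(examples, key=lambda ex: math.dist(ex[0], point))[:k]
    labels = [label for _, label in nearest]
    top = max(set(labels), key=labels.count)
    # Return the majority label and its vote share -- a probability, not a certainty
    return top, labels.count(top) / k

# Hypothetical data: user features mapped to the ad they clicked
examples = [((1, 1), "ad_A"), ((1, 2), "ad_A"), ((8, 9), "ad_B"), ((9, 8), "ad_B")]
print(predict(examples, (2, 1)))  # majority label 'ad_A'
```

Note what the output does and doesn't say: nearby users clicked ad_A (a correlation), but nothing here explains *why* — the causation caveat above still applies.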
Neural Network – In this type of machine learning computers learn a task by analyzing training examples. It is modeled loosely on the human brain—the interwoven tangle of neurons that process data in humans and find complex associations.
Neural networks were first proposed in 1944 by two University of Chicago researchers (Warren McCullough and Walter Pitts) who moved to MIT in 1952 as founding members of what’s sometimes referred to as the first cognitive science department. Neural nets were a major area of research in both neuroscience and computer science until 1969. The technique then enjoyed a resurgence in the 1980s, fell into disfavor in the first decade of the new century, and has returned like gangbusters in the second, fueled largely by the increased processing power of graphics chips.
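The core computation inside a single artificial neuron can be sketched in a few lines (the weights below are made up for illustration; a real network learns its weights by analyzing training examples):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of inputs passed through
    a sigmoid activation, loosely modelled on a biological neuron."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))  # squashes the output into (0, 1)

# Illustrative values only -- training would adjust the weights and bias
print(neuron([0.5, 0.8], [1.2, -0.4], bias=0.1))  # ~0.59
```

A network wires thousands or millions of these together in layers, which is where the “interwoven tangle” finding complex associations comes from.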
Open Source AI – When the source code of an AI is available to the public, it can be used, modified, and improved by anyone. Closed AI means access to the code is tightly controlled by the company that produced it.
The closed model gives users greater certainty as to what they are getting, but open source allows for more innovation. Open-source AI would include Stable Diffusion, Hugging Face, and Llama (created by Meta). Closed Source AI would include ChatGPT and Google’s Bard.
Prompts – Instructions given to an AI. They are the main way to steer the AI in a particular direction, indicate intent, and offer context. Writing them can be time-consuming if the task is complex.
Prompt Engineer – An advanced user of AI models, a prompt engineer doesn’t possess special technical skills but is able to give clear instructions so the AI returns results that most closely match expectations.
This skill can be compared to a psychologist who is working with a client who needs help expressing what they know.
Red Teaming – Testing an AI by trying to force it to act in unintended or undesirable ways, thus uncovering potential harms.
The term comes from a military practice of taking on the role of an attacker to devise strategies.
While some of these definitions are a bit of an oversimplification, they will point the beginner in the right direction. -Stephen Goforth
Becoming is a service of Goforth Solutions, LLC / Copyright ©2025 All Rights Reserved