Author: admin

  • Oh, Video… are you true?

    This is how insider trading REALLY works… or does it? 👀 #shorts #trading #donaldtrump

    At first glance, it’s a quick, punchy, and honestly pretty convincing video. The speaker’s presence, the graphics, and the whole tone all work together exactly the way this kind of content is designed to spread. But that’s also what made me stop and want to take a second look at it. It’s not just the lack of context or sources, or how it feels more geared toward getting a reaction than actually informing me, the viewer. It’s the whole package: the layout, the title, and that catchy but mysterious background tune pulling you in. That’s why I decided to try and break this one down using a lateral reading approach.

    Turns out, it’s a lot harder in practice than John from Crash Course Navigating Digital Information makes it appear, especially with these dang short-form videos. My two little brain cells almost caught fire….


    Step 1: Stop – What is the claim, and what feels off?

    The video presents the claim that Donald Trump just moved $580 million… with one post. The implication is that Donald Trump knew this would happen and planned it.

    The red flags I noticed immediately:

    • No other source cited in the video except the graph in the video that states Source: Bloomberg
    • Strong, confident messaging tone paired with an impactful, mysterious soundtrack, without any additional sourcing or specific evidence
    • A very short format video, no longer than 34 seconds, and purely focused on Donald Trump, the trade, and the mystery of it all…
    • Feels like it’s simplifying something far more complex than just a suspicious trading deal made right before the President announced a delay in the bombing of Iranian energy infrastructure.

    This is exactly the kind of short, attention-grabbing video that has been discussed in the module. Fast, emotionally charged, and incredibly easy to share without really thinking it through.


    Step 2: Investigate the Source

    Next, I researched the YouTube channel itself.

    Here are a couple of basic questions I wanted to answer:

    • Who runs this account?
    • Do they cite sources in other videos?
    • Are they informational, opinion-based, or entertainment?

    Here’s what I found:

    • From what I found in the YouTube and Instagram descriptions, Thinknomy presents itself as a channel focused on wealth, entrepreneurship, discipline, and a success-driven mindset, using storytelling (short clips from movies, news, or other related media). They pride themselves on creating insights from “thinkers” of the day in order to inspire growth and achievement at the individual level. Their messaging focuses on turning ideas into action through consistency and intentional effort. The channel has approximately 4.66K subscribers and 47 videos on YouTube, with a total viewership of 4.3 million, and has been active in the YouTube space since December 26th, 2025. Yet the video in question has gained significant traction, with 1.6 million views and 1,065 comments since it was posted on March 28th, 2026. Thinknomy also maintains an Instagram presence; however, there is no clear information on who operates, funds, or determines its content, raising questions about transparency and credibility.
    • No clear credentials or sourcing tied to the claim or the channel, for that matter.

    That doesn’t automatically make it false, but it does mean I shouldn’t trust it at face value.


    Step 3: Lateral Reading – What do other sources say?

    Instead of staying on YouTube, I opened some new tabs and conducted a search:

    Did Trump participate in, or provide information that led to, this big trade before he posted on Truth Social? Are we looking at possible insider trading, or is this a coincidence?

    Reputable News Source #1

    I checked coverage from BBC News


    👉 Oil traders bet millions ahead of Trump’s Iran talks post

    Findings:

    • Here’s the kicker: in order to review this story from the BBC, there’s a paywall. In fact, many of the “reputable news outlets” require some form of subscription or an email login to access their content. WSJ, The New York Times, BBC, Reuters, and many others all use this model. What really got me was that if I even wanted to verify the graph in the video, I would need to pay to access the Bloomberg website.
    • Provides me with nothing due to the paywall.

    You’ve got to pay to read their “truth” — and subscribe just to question it.


    Reputable News Source #2

    I also looked at CBS News online


    👉 Oil trades surged just before Trump’s post on Iran talks. Some experts are suspicious. – CBS News

    Findings:

    • The CBS News article reports a real spike in oil trading right before Trump announced the pause in the bombing of Iran. The article goes on to note that financial experts found the timing of Trump’s Truth Social post and the sudden spike in trading “suspicious”. However, it stopped short of offering proof of insider trading. It does attempt to offer potential explanations, including coincidence and algorithmic trading. By comparison, the YouTube short simplifies this event into a 34-second video that may imply wrongdoing. While the CBS article does confirm that this type of large trading is unusual and its timing strange, it also adds context that the video doesn’t even attempt to engage with. Overall, the reporting does not support the video’s tone and implied conclusion, and it highlights how missing context can turn a questionable event into a misleading narrative or, worse, take a truth and distort it.
    • It does, however, confirm that a significant and interesting trade in securities and crude oil occurred within minutes of Trump’s post on March 23rd, 2026; it does not contradict the video in that sense, but the article doesn’t imply Trump is the mastermind behind the trades, as the video’s title would suggest.

    This source suggests the issue is potentially more complex than the video suggests.


    Step 4: Fact-Checking Source

    👉 Snopes


    https://www.snopes.com

    RumorGuard from the News Literacy Project

    Findings:

    • I couldn’t find anything on either of these sites.
    • However, when I looked at different news outlets (those I could access without paying or giving up my email) that covered the $580M oil trades before Donald Trump’s post, they were consistent on the facts surrounding the matter and mostly the same details, but the spin on Trump didn’t stay the same. Outlets like CBS News and Fortune kept things cautious, using words like “suspicious” without going hard in the paint. Meanwhile, others like The Daily Beast or Media Matters for America climbed over each other and did more than just hint at insider trading. Even the more extreme outlets accused Trump directly.

    is this true? Donald Trump just move $580 million… with one post – Google Search

    “Did Donald Trump really move $580 million with one post?” That’s what I tossed into the good ole trusty Google search machine, and the algorithm bequeathed me the following…

    After digging through a handful of “credible” sources, I found Google was kinda all over the place: most of the news sources leaned liberal, a couple hovered in the center, and one in the top ten you could consider conservative; however, all told the same story with slightly different spins depending on their audience. Same facts, different flavor. Even AI tends to lean a little left of center, so yeah… I’m not exactly taking anything at face value here.


    Step 5: Check Context

    This part matters more than people truly understand.

    I looked at:

    • Date – Is this old information being reused?
    • Location – Is this specific to one place?
    • Missing details – What’s not being said?

    In this case:

    • So it’s not an old posting, and the event it’s showing is likely still developing. It focuses on the suspicious nature and timing of Donald Trump’s posting and the time the trades were executed. What stood out to me is what’s missing in the video: there’s no proof of who made the trades, no confirmation of insider knowledge, and no expert sources. It leans heavily on suspicious timing without real evidence. Overall, it grabs attention but feels incomplete and leaves out key details needed to fully understand the situation.

    This is where a lot of misinformation lives—not in completely false claims, but in missing context.


    My Humble Assessment

    After walking through all of this, my conclusion is that the claim in the video is:

    👉 Misleading but partially true.

    Here’s why:

    • It lacks detailed sourcing and context
    • It oversimplifies a more complex issue and timeline
    • Other credible sources don’t specifically contradict it, but they do offer important context
    • There is no specific data to support the claim

    Why This Matters

    This whole process took me about two and a half hours (way too many rabbit holes… I almost fell into one and never got out… lol), and honestly, it all but confirmed something I had already suspected: that most short videos are built to hit your emotions, not actually inform you.

    Trying to use lateral reading here was a bit of a dead end for me. The content creator didn’t provide any sources, just vague claims and an “Is this genius… or manipulation?” prompt (googly eyes, comment your opinion), which is fine if you treat it like entertainment or a thought exercise and just move on after. But as you’ll see if you spend just a few moments in the comments, viewers treated the video as validation of their own perspective and as evidence of corruption.

    One solid takeaway or perspective I would recommend everyone consider: every outlet has some level of bias, whether political, religious, or ideological. That doesn’t make them liars or outright untrustworthy, but it does mean their content is, or likely will be, influenced. Bottom line: don’t look for just the perfect… or… your… truth.

  • Learning The Hard Way

    I believe we can all agree that the concept of misinformation isn’t new. Humans have, at least since the written and spoken word, engaged in some form of misinformation. Heck, I’ve been around long enough to remember when bad info was primarily spread by word of mouth, print, and TV, not by today’s social media, system algorithms, and infinite doom scrolling in your favorite feed. The difference today and likely in the future, as technologies adapt and improve, is in the speed, access, and our ability to interact with this influx of “information”. You don’t just read or hear something these days; you can see it, react in an instant, and share it before you’ve even considered taking a moment to think the content through.

    So in this module, we explore three games and credibility tools in an attempt to gain some insight into the motivations and means by which a person or entity could manipulate information and provoke reaction or action. I looked at two of them: RumorGuard and the Bad News game.


    Bad News – The Way the Game Was Meant to be Played

    https://www.getbadnews.com/books/english

    Bad News takes a completely different approach, exposing you to the machinery of developing and disseminating information. You play as the bad actor. You’re building a fake news brand, gaining followers, and trying to maintain just enough credibility to keep people hooked.

    For me, this was an interesting little game. As you start playing and earning followers, you start focusing on content that you’ve seen before, either figuring out how to spin it for more traction or questioning whether you believe any of it at all. It kind of forces you into this gray area: meet the intent of the game or be a better person and refrain. The game subtly pushes you to make decisions just to keep progressing, and before you know it, you’re playing along. You start to notice the tactics pretty quickly:

    • Push outrage or keep it subtle?
    • Go full conspiracy or stay believable?
    • How far can I go before people stop trusting me?

    And that’s the point: you quickly start to realize misinformation isn’t always just about outright lies and crazy conspiracies. It’s the art of framing the content, leveraging a desired emotional response, and timing to maximize both reach and awareness in order to build credibility.

    In some research, I found an article (from Module 1) where Joan Donovan describes the lifecycle of media manipulation: planning, seeding, responses, changes to the information ecosystem, and finally, adjustment. The game walks you through those stages, not directly but indirectly, as you earn 7 badges or tools. You’re basically running your own mini-influence operation from behind a desk.

    I also found a YouTube video where Kate Starbird suggests that misinformation spreads because of emotional appeal. We, the viewers, find a connection with the content, and the emotional connection appears to be the fastest and most effective. It’s difficult to discredit emotion; you can maybe discredit the facts surrounding an emotional response, but not entirely the response itself. This game leans heavily into that by leveraging anger and outrage (emotional responses). You can build engagement quickly and likely build solid loyalty from those who engage with the content. The developers describe these games as a form of “psychological vaccine”: the theory is that by using these tactics ourselves in the games, we can start to recognize manipulation in the content we review.

    Is it perfect? No, but nothing really is, and truthfully, I’m even skeptical of the game itself. I felt manipulated, and the game felt very linear, guiding me down a specific path with the perception of choice. That said, based on my limited research, I’m confident that most misinformation involves some form of bots, algorithms, and bigger networks than this game can really demonstrate. But from a pure learning standpoint, I think it gives you some basic awareness of how bad actors could use these techniques and tools to manipulate information in an unrestricted digital ecosystem.


    RumorGuard – Wait, that’s not true?

    https://www.newslit.org/tools/rumorguard

    RumorGuard is a “fact-checking” tool. It takes real viral posts, headlines, images, and social media claims, and breaks them down using things like verified evidence, context, and source credibility. RumorGuard offers a reasonable and plausible explanation of what’s wrong (or right) with the information and gives the viewer the sources to validate on their own terms.

    Using it feels a lot like sitting through a solid brief. It’s straightforward, informative, and built on a logic matrix. What stood out to me is how much this lines up with research from Gordon Pennycook (read the abstract – it’s the only part that is free… lol). His abstract states: “Together with additional computational analyses, these findings indicate that people often share misinformation because their attention is focused on factors other than accuracy—and therefore they fail to implement a strongly held preference for accurate sharing.”

    In my reading of the statement, people often share misinformation not because they believe it, but because they’re not paying attention to what they are consuming. RumorGuard offers a means to take a moment and “pay attention”, that is, if you choose to do so.

    So why don’t more people use tools like RumorGuard? Simple: it’s passive. You have to want to question, to step out of your comfort zone or bubble, and to ask whether what I’m reading, watching, or feeling is meaningful and honest. Why should knowing the truth be more important than my ideology and moral concerns? How do I use this new awareness?


    My Final Thoughts – Questioning my questions

    Between the Bad News game and RumorGuard, both attempt to build a sense of awareness. What really caught me off guard was how easy it was to use RumorGuard and other tools like it.  I generally pull my news from several sources. I use the Ground News app, and I review information from several sources with an understanding of the source’s motivations. If we genuinely want to learn and be better informed, tools like RumorGuard and others like it are a start. If we take the moment to see each other and not the content, we might all find common ground on the important matters of our coexistence.

  • A 24-Hour Media Diary: What I Consumed and What I Questioned

    I’ve got to say… this assignment was much harder than I thought it would be. Tracking one’s own media consumption across multiple platforms and devices is honestly a pain in the you-know-what. So I got a little creative. Truth be told, it also freaked me out a bit.

    Hear me out… I watch a lot of random stuff on Facebook videos and YouTube (yeah, I’m old enough to remember Myspace…lol). Come to find out, you can download your entire activity history from Google, across all devices. That realization alone was a little unsettling. Google really does track everything you do online.

    Anyway, back to the assignment. Instead of using the typical “I woke up at…” approach, I pulled metadata from my activity history over a little more than a 24-hour period. I also marked videos I found interesting or questionable by saving them as I was watching them. Below is a sample of what I watched, and fair warning, I’m all over the place.

    I didn’t change my habits at all, I just paid more attention to what I normally do and watch. Most of my media consumption came from YouTube, along with some quick Google searches and general doom scrolling through Facebook videos like anyone else.

    What surprised me wasn’t just how much media I consume (it’s a lot), but also that at some level, there are questionable credibility and creators. Not necessarily fake, but exaggerated, biased, or missing context.
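    For anyone curious, the metadata pull I mentioned can be scripted instead of done by hand. Here’s a rough sketch in Python, assuming a Takeout-style watch-history.json export where each entry carries "time" and "title" fields (your export may name things differently, so treat the field names as assumptions):

    ```python
    import json
    from collections import Counter
    from datetime import datetime

    # Hypothetical sketch: tally videos watched per hour from a Google Takeout
    # "watch-history.json" export. The field names ("time", "title") reflect a
    # typical export and may differ depending on your Takeout settings.

    def summarize_watch_history(path, start, end):
        """Count videos watched per hour within the [start, end] window."""
        with open(path, encoding="utf-8") as f:
            entries = json.load(f)

        per_hour = Counter()
        for entry in entries:
            ts = entry.get("time")
            if not ts:
                continue  # skip entries with no timestamp
            # Takeout timestamps end in "Z"; fromisoformat wants "+00:00"
            when = datetime.fromisoformat(ts.replace("Z", "+00:00"))
            if start <= when <= end:
                per_hour[when.strftime("%Y-%m-%d %H:00")] += 1
        return per_hour
    ```

    Running that over my 24-hour window is basically how I built the timeline below, minus the manual notes on what each video actually was.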


    Media Diary

    March 28, 2026

    9:00–11:30 a.m.
    Started off watching longer YouTube videos about an RV tour (in the market soon) and one about rattlesnakes and how their rattles work. These felt pretty normal and informative: slower pace, nothing really raised red flags.

    12:30–1:00 p.m.
    Watched a couple more videos about gaming and general aviation, and watched some updated news regarding the recent Air Canada crash. Mostly entertainment or educational content. Nothing stood out as misleading. Also had lunch with my wife at home. We’re big fans of a good cheese plate.

    1:00–7:00 p.m.
    Played Star Citizen with friends. This game has been in development since 2012 and is crowdfunded—I’ve definitely invested more money into it than I probably should have. Since this was a big block of time, I’ll mention a YouTube creator I follow for this game—BoredGamer. I’d consider him a pretty reliable source within that niche.

    7:00 p.m.
    Jumped back on YouTube and started scrolling casually. Saw a mix of home improvement, memes, and random clips.

    • “Most Insane Cancel Culture Story” caught my attention right away—it already sounded a bit over the top. The video talks about a guy who raised money holding a Busch Light sign, donated it, then got investigated by a journalist over old social media posts from when he was 16. The company pulled support, public backlash followed, and eventually the journalist got fired over his own past posts. The video tried to frame it like public outrage caused the journalist’s firing, which is only partially true, but it felt like there was more to the story than what was being presented.

    8:00–10:45 p.m.
    This is where things started to shift. I was watching short clips between homework and dinner.

    A few that stood out:

    The first and last one made me pause. I’ve heard that claim before and knew it didn’t sound right: misleading at best, flat-out wrong at worst.

    March 29, 2026

    9:00–10:00 a.m.
    Heavy scrolling, mostly YouTube Shorts. Watched things like:

    • “$102 Million CASH OUT”
      • This one caught my attention because the creator, Infinite Wealth Lab, uses movie clips to promote financial advice.
    • “Saved MILLIONS in TAXES By REFUSING the DELIVERY”
    • “Navy SEAL Reveals Why All Moms Should Carry a Gun”
      • This one was interesting. It strongly pushes the idea of gun ownership through a worst-case scenario involving children. Regardless of where someone stands on that issue, the video clearly leans into fear and emotional impact to make its point.

    At this point, a pattern was pretty obvious from the content I was now consuming. Big claims, strong opinions, and a lot of authority being used to sell the message.

    1:00–3:30 p.m.
    More Shorts, even faster pace:

    • “PRICE of LOYALTY? $1 MILLION in CRYPTO.”
    • “What If Young People Paid Zero Tax?”
      • This video from Triggernometry proposes no income tax until age 30 and none after 60. It encourages economic participation and suggests alternatives to things like Social Security. Interesting idea, but clearly framed as a strong opinion.
    • “The baby had new marks every morning. The dog knew…”

    This was probably the most questionable stretch. Everything was quick, emotional, and clearly designed to get a reaction more than encourage critical thinking.

    What Actually Seemed Questionable

    Most of what I saw wasn’t outright fake—it was more subtle than that. It felt like half-truths, exaggerated claims, or information framed in a specific way to push a certain reaction.

    The law enforcement videos stood out the most. Titles like “tyrant cop” or “gets owned” immediately tell you how you’re supposed to feel before you even watch. That’s not neutral—it’s framing. Even if the situation is real, you’re only seeing a snapshot with limited context, and the creator is setting the tone for how you interpret it.

    Then there are the emotional videos, like the one about the baby and the dog. Those are probably the easiest to believe because they hit you emotionally first. But once you slow down and think about it, there’s usually no real source or proof backing them up.

    How I Tried to Fact-Check

    For some of these, I actually stopped and started looking things up.

    For the “porcelain Glock” idea being discredited by WarriorPoetSociety (referencing a scene from Die Hard 2), I checked reliable sources, including manufacturer information. Turns out it’s a myth, and a significantly perpetuated one. Glocks absolutely contain metal components and are detectable by any metal detector, especially airport systems. This confirmed my initial reaction that the video’s debunking was accurate.

    Another example was a story about a woman dying in a hyperbaric chamber accident involving a horse. My first reaction was, “No way, that can’t be real.” But after digging into it, I found that it was a real incident.

    So my approach became pretty simple: if I couldn’t verify something fairly quickly through credible sources (searching a variety of “agreed-upon services” to validate the source, then bouncing it against several AI platforms like ChatGPT and Grok), I wasn’t going to treat it as reliable or share it.

    Reflection

    This exercise made me realize that I come across way more questionable content than I could have ever imagined, and it’s not always obvious. As for the idea that every bit of information and content has to be verified: I fear most of us would rather just take the influx of information at face value until someone challenges us.

    Most of it comes down to how things are presented:

    • Big, attention-grabbing claims
    • Emotional hooks
    • Strong opinions framed like facts

    I also noticed how quickly the algorithm locks you into a pattern of content. Watch one “bad cop” video, and all of a sudden you’re seeing five more, all with the same framing. Same thing with financial content, pet videos, or political takes.

    A significant takeaway for me is how easy it is to just keep scrolling through content without really questioning anything you’re seeing. Most of this content is designed to be consumed quickly, not critically. It scratches an itch (entertainment or self-validation), and over time, it can reinforce whatever you already believe.

    I don’t think all of this is intentional, but I do think there’s an economic incentive behind it. Platforms benefit from engagement, and content creators know what gets clicks. That said, there’s definitely potential for deliberate manipulation, and it has likely already happened.

    After going through this, I’d say I’m definitely more skeptical. Especially with short-form content like YouTube Shorts and fast-paced Facebook videos. I don’t use TikTok, but I imagine it’s the same. If something sounds extreme or too clean of a story, it probably deserves a second look.

  • LOOK AT ME – I’m A BLOGGER!

    These are my first digital words on this page… and that’s all I have to say for now.