QUANT BOUNTY A teenager in California recently shocked his family by making US$ 50,000 in minutes through cryptocurrency. The 14-year-old created a meme coin called ‘QUANT’ on Pump.Fun – a platform for launching such tokens.

After buying five percent of the coin’s supply for US$ 350, he livestreamed and triggered a buying frenzy. Within eight minutes, his holdings were worth nearly US$ 30,000, which he promptly cashed out.

He repeated the process with two more coins, profiting from what is known as a ‘soft rug pull’, where creators dump their holdings and cause a coin’s value to crash. The backlash was swift: traders accused the teen of unethical behaviour, and his family faced threats that overshadowed any financial success.

While meme coins have grown in popularity, the unregulated market around them often leaves investors vulnerable to scams. Platforms such as Pump.Fun aim to standardise token creation but struggle to prevent abuses such as soft rug pulls. Critics argue that such actions exploit regulatory gaps, allowing unethical but potentially legal profiteering.

Although the teen made money, this incident raises questions about ethics and accountability in the crypto space – especially as platforms such as Pump.Fun democratise access. The unregulated nature of meme coins remains a breeding ground for both innovation and controversy.

ROBOT RISKS Of late, researchers have exposed vulnerabilities in large language models (LLMs), revealing how these systems can be manipulated to produce harmful outputs. Experiments have shown that such misbehaviour extends to the physical world when LLMs power robots.

From self-driving cars that ignore stop signs to robotic dogs that enter restricted areas, the risks are mounting.

Researchers at the University of Pennsylvania used a method called ‘RoboPAIR’ to generate jailbreak prompts that bypass safety rules. These attacks allowed robots to execute unsafe actions such as detonating bombs or spying on people. The vulnerabilities highlight the dangers of integrating LLMs into physical systems without robust safeguards.

As LLMs expand into applications such as self-driving cars and medical tools, their susceptibility to manipulation poses serious concerns.

Experiments reveal that even multimodal models, which process images and text, can be tricked into unsafe behaviour by cleverly phrased commands. The ‘attack surface’ grows as AI systems become more interactive.

This underscores the urgent need for better security measures in robots powered by artificial intelligence. Without proper guardrails, the consequences of these vulnerabilities could be catastrophic – both in controlled settings and real-world applications.

CLOSE WINDOWS Microsoft is urging users to upgrade their ageing Windows 10 PCs as the October 2025 ‘end of support’ date looms. With security updates ceasing, the company is promoting Windows 11’s new features, and the improved performance and security offered by modern hardware.

Chief Marketing Officer Yusuf Mehdi frames this transition as part of a broader push for the year of the Windows 11 PC refresh, encouraging users to embrace new devices that meet current hardware standards. Many older PCs, while potentially capable of running Windows 11, may lack official support, leaving them vulnerable once updates stop.

For those who are resistant to upgrading, Microsoft offers an Extended Security Update (ESU) programme. However, it comes at a cost – US$ 30 for individuals and escalating fees for organisations, depending on their needs.

The stakes are high. With Windows 10 still dominating market share, leaving these systems unprotected could create widespread vulnerabilities. Despite Microsoft’s efforts to incentivise upgrades, replacing the massive number of ageing Windows 10 machines in time remains a challenge.

This transition underscores the company’s strategy of balancing user needs against its push toward newer hardware.

AID TECH BOT In the midst of Lebanon’s crisis, Hania Zaatari – a mechanical engineer attached to the Ministry of Industry – developed a WhatsApp-based chatbot called ‘Aidbot’ to bridge the gap between displaced families and available aid.

Driven by the destruction of homes and livelihoods caused by escalating conflicts, Zaatari’s initiative provided a streamlined solution to distribute essential supplies such as food, mattresses, blankets and medicine to those in need.

She used her programming skills to create a platform that efficiently gathers information about the needs of displaced individuals.

Aidbot collects basic details such as location and the required aid, and stores this data in a shared Google spreadsheet for systematic distribution. By automating the process, Aidbot greatly reduces the time spent managing requests, and enables faster and more effective aid delivery.
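The intake flow described above can be sketched as a simple per-user state machine. This is a hypothetical simplification, not Aidbot’s actual code: the real bot runs over WhatsApp and writes to a shared Google spreadsheet, which the in-memory `requests` list stands in for here.

```python
# Minimal sketch of an Aidbot-style intake flow (hypothetical simplification).
# Each user moves through two questions - location, then the aid required -
# and the completed request is appended to a record that stands in for the
# shared Google spreadsheet used in the real deployment.

requests = []   # stands in for the shared spreadsheet
sessions = {}   # per-user conversation state

def handle_message(user: str, text: str) -> str:
    """Advance one user's conversation by a step and return the bot's reply."""
    state = sessions.get(user)
    if state is None:
        # First contact: start a session and ask for location.
        sessions[user] = {"step": "location"}
        return "Welcome to Aidbot. Where are you currently located?"
    if state["step"] == "location":
        state["location"] = text.strip()
        state["step"] = "aid"
        return "What aid do you need (e.g. food, mattresses, blankets, medicine)?"
    if state["step"] == "aid":
        # Conversation complete: record the request and close the session.
        requests.append({"user": user,
                         "location": state["location"],
                         "aid": text.strip()})
        del sessions[user]
        return "Thank you - your request has been recorded."

# Example exchange for one user:
handle_message("u1", "hi")        # greets and asks for location
handle_message("u1", "Sidon")     # asks what aid is needed
handle_message("u1", "blankets")  # records the request
```

Because all state lives in two dictionaries, a real deployment would only need to swap the `requests` list for a spreadsheet append and wire `handle_message` to an incoming-message webhook.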

John Bryant, a research fellow at the Overseas Development Institute (ODI), praised the initiative’s cultural relevance, emphasising the importance of local knowledge and trusted human interaction in enhancing digital tools for humanitarian aid.

Though it may not solve every problem, Aidbot offers a more efficient and community-driven approach to aid delivery.