<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Gemini 3 Flash on Melabit</title>
    <link>https://melabit.com/en/tags/gemini-3-flash/</link>
    <description>Recent content in Gemini 3 Flash on Melabit</description>
    <generator>Hugo</generator>
    <language>en</language>
    <lastBuildDate>Mon, 16 Mar 2026 18:00:00 +0000</lastBuildDate>
    <atom:link href="https://melabit.com/en/tags/gemini-3-flash/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Antigravity: the LLM does it better</title>
      <link>https://melabit.com/en/2026/03/16/antigravity-the-llm-does-it-better/</link>
      <pubDate>Mon, 16 Mar 2026 18:00:00 +0000</pubDate>
      <guid>https://melabit.com/en/2026/03/16/antigravity-the-llm-does-it-better/</guid>
      <description>&lt;p&gt;Over the last few weeks, I’ve been writing about my experiments with Antigravity, or rather, with the (more or less) &lt;em&gt;intelligent agents&lt;/em&gt; integrated into the editor. The results have been mixed: sometimes &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/&#34;&gt;the agents proved to be very effective&lt;/a&gt;, smoothly handling some complex or repetitive tasks; in other cases &lt;a href=&#34;https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/&#34;&gt;they didn&amp;rsquo;t accomplish anything worthwhile&lt;/a&gt;, only wasting a huge amount of time.&lt;/p&gt;&#xA;&lt;h4 id=&#34;coding-badly-with-an-llm&#34;&gt;Coding badly with an LLM&lt;/h4&gt;&#xA;&lt;p&gt;Among all the more or less useful applications of LLMs, one of the most controversial is the idea that anyone can use an LLM to program, even when they don&amp;rsquo;t know the first thing about it. This is the so-called &lt;em&gt;vibe coding&lt;/em&gt;, a concept that can be &lt;a href=&#34;https://x.com/karpathy/status/1886192184808149383&#34;&gt;summarized more or less like this&lt;/a&gt;: &amp;ldquo;I just see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works&amp;rdquo;. 
This is not a quote from &lt;a href=&#34;https://www.youtube.com/watch?v=-w9EBTB8a3Y&#34;&gt;Nanni Moretti&lt;/a&gt;, a famous Italian actor and director; it was written by Andrej Karpathy, one of the founders of OpenAI.&lt;/p&gt;&#xA;&lt;p&gt;Now, setting aside the many issues regarding security, the maintainability of code produced without really knowing what it does, and the &lt;a href=&#34;https://taylorandfrancis.com/knowledge/Engineering_and_technology/Engineering_support_and_special_topics/Regression_to_the_mean/&#34;&gt;regression to the mean&lt;/a&gt;, which here implies that the quality of AI-produced code will inevitably tend to collapse, it only takes a bit of personal experimentation, even with simple projects, to realize that the very idea of &lt;em&gt;coding for everyone&lt;/em&gt; doesn&amp;rsquo;t work. If you want to get something good out of AI, you have to know exactly what you want and how you want it.&lt;/p&gt;&#xA;&lt;p&gt;I tried &lt;em&gt;vibe coding&lt;/em&gt; myself: I asked Gemini to &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;write a driver for an Arducam camera&lt;/a&gt;, explaining only the bare essentials and waiting for it to do everything on its own.&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;However, this approach did not work. It took hours upon hours of work, errors, inconclusive answers, corrections, and advice from me to reach an acceptable result.&lt;/p&gt;&#xA;&lt;p&gt;And this was only possible because I eventually abandoned the idea of letting Gemini do everything and provided it with specific advice on how to proceed. 
Without that, it would have only been a massive waste of time.&lt;/p&gt;&#xA;&lt;div style = &#34;border-style: solid; border-width: 0px 0px 0px 12px; border-color: #03a87c; background-color: #defef6; padding: 1em; color:black; font-size:100%;&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://emeralit-muloblog.statichost.page/&#34;&gt;Mimmo&lt;/a&gt; also tried, on his own, to have &lt;a href=&#34;https://lumo.proton.me&#34;&gt;Lumo&lt;/a&gt; write a MicroPython program to manage my Arducam Mini 5MP Plus with the Pico. Even Lumo, which is undoubtedly one of the best chatbots around, couldn&amp;rsquo;t do it.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;br /&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/Gemini_Generated_Image_jb08gsjb08gsjb08.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;coding-better-with-the-same-llm&#34;&gt;Coding better with the same LLM&lt;/h4&gt;&#xA;&lt;p&gt;This doesn&amp;rsquo;t mean that an LLM can&amp;rsquo;t be useful when programming; you just have to use it properly. You must explain exactly what you want it to do, provide all the necessary materials, and set boundaries within which it must operate.&lt;/p&gt;&#xA;&lt;p&gt;In short, you have to treat it like a graduate student or an intern, with little experience but a great desire to learn.&lt;/p&gt;&#xA;&lt;p&gt;It only took two carefully written prompts to make a difference. 
The first defined the project&amp;rsquo;s directory structure and instructed the agent to use &lt;code&gt;git&lt;/code&gt; for version control and &lt;code&gt;uv&lt;/code&gt; to install any necessary Python packages,&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;Start a new project and name it &lt;code&gt;project13-pico-camera5mp-reprise&lt;/code&gt;.&lt;/p&gt;&#xA;    &lt;p&gt;Read the &lt;code&gt;AGENTS.md&lt;/code&gt; file that defines your basic behaviour.&lt;/p&gt;&#xA;    &lt;p&gt;Create a new directory with this name and initialize an empty &lt;code&gt;git&lt;/code&gt; repository using &lt;code&gt;git init&lt;/code&gt;.&#xA;    Use &lt;code&gt;uv&lt;/code&gt; for managing Python packages.&#xA;    All the code must stay in the &lt;code&gt;code&lt;/code&gt; directory, with no subdirectories within it.&#xA;    This project does not need a &lt;code&gt;data&lt;/code&gt; directory and the &lt;code&gt;output&lt;/code&gt; directory must be renamed as &lt;code&gt;images&lt;/code&gt;.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;while the second prompt contained everything I could think of to help it work.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;I have a Raspberry Pi Pico 2W connected to an Arducam 5MP Plus OV5642 Mini Module Camera Shield (&lt;a href=&#34;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html&#34;&gt;https://www.arducam.com/arducam-5mp-plus-spi-cam-arduino-ov5642.html&lt;/a&gt;).&lt;/p&gt;&#xA;    &lt;p&gt;I have connected the camera to the Pico using the standard layout found in the documentation:&lt;/p&gt;&#xA; 
   &lt;table&gt;&#xA;      &lt;thead&gt;&#xA;          &lt;tr&gt;&#xA;              &lt;th&gt;Camera&lt;/th&gt;&#xA;              &lt;th&gt;CS&lt;/th&gt;&#xA;              &lt;th&gt;MOSI&lt;/th&gt;&#xA;              &lt;th&gt;MISO&lt;/th&gt;&#xA;              &lt;th&gt;SCK&lt;/th&gt;&#xA;              &lt;th&gt;GND&lt;/th&gt;&#xA;              &lt;th&gt;VCC&lt;/th&gt;&#xA;              &lt;th&gt;SDA&lt;/th&gt;&#xA;              &lt;th&gt;SCL&lt;/th&gt;&#xA;          &lt;/tr&gt;&#xA;      &lt;/thead&gt;&#xA;      &lt;tbody&gt;&#xA;          &lt;tr&gt;&#xA;              &lt;td&gt;Pico&lt;/td&gt;&#xA;              &lt;td&gt;GP5&lt;/td&gt;&#xA;              &lt;td&gt;GP3&lt;/td&gt;&#xA;              &lt;td&gt;GP4&lt;/td&gt;&#xA;              &lt;td&gt;GP2&lt;/td&gt;&#xA;              &lt;td&gt;GND&lt;/td&gt;&#xA;              &lt;td&gt;3V3&lt;/td&gt;&#xA;              &lt;td&gt;GP8&lt;/td&gt;&#xA;              &lt;td&gt;GP9&lt;/td&gt;&#xA;          &lt;/tr&gt;&#xA;      &lt;/tbody&gt;&#xA;    &lt;/table&gt;&#xA;    &lt;p&gt;The main documentation for the camera can be found at these links:&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Introduction/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Introduction/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Hardware/Arducam-Shield-Mini-5MP-Plus/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Hardware/Arducam-Shield-Mini-5MP-Plus/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Software/Quick-Start-Guide/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Software/Quick-Start-Guide/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a 
href=&#34;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Pico/Camera-Module/SPI-Camera/&#34;&gt;https://docs.arducam.com/Arduino-SPI-camera/Legacy-SPI-camera/Pico/Camera-Module/SPI-Camera/&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://www.uctronics.com/download/Image_Sensor/OV5642_DS.pdf&#34;&gt;https://www.uctronics.com/download/Image_Sensor/OV5642_DS.pdf&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://blog.arducam.com/downloads/shields/ArduCAM_Camera_Shield_Software_Application_Note.pdf&#34;&gt;https://blog.arducam.com/downloads/shields/ArduCAM_Camera_Shield_Software_Application_Note.pdf&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;li&gt;&lt;a href=&#34;https://www.uctronics.com/download/Amazon/B0067-B0068-Pico.pdf&#34;&gt;https://www.uctronics.com/download/Amazon/B0067-B0068-Pico.pdf&lt;/a&gt;&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;The code to drive this camera with the Pico is written in C (i.e., for Arduino) and CircuitPython, and can be found here: &lt;a href=&#34;https://github.com/ArduCAM/PICO_SPI_CAM&#34;&gt;https://github.com/ArduCAM/PICO_SPI_CAM&lt;/a&gt;. 
You can download all the files you need from this repository.&#xA;    In particular, the files for CircuitPython are in the &lt;code&gt;Python&lt;/code&gt; directory:&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;&lt;code&gt;Arducam.py&lt;/code&gt;, which contains the functions to drive the camera,&lt;/li&gt;&#xA;    &lt;li&gt;&lt;code&gt;OV5642_reg.py&lt;/code&gt;, which, as far as I have understood, defines the resolutions allowed when taking pictures, and&lt;/li&gt;&#xA;    &lt;li&gt;&lt;code&gt;boot.py&lt;/code&gt;, but I don&amp;rsquo;t know what it does.&#xA;    In the same directory there is also the script &lt;code&gt;ArduCAM_Mini_5MP_Plus_VideoStreaming.py&lt;/code&gt;, which sends the video captured by the camera to a Windows application and is of no use here.&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;Please write a basic CircuitPython script that uses the functions defined in &lt;code&gt;Arducam.py&lt;/code&gt; to test that the camera works and can take photos.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;These two better-crafted prompts allowed &lt;code&gt;Gemini 3 Flash&lt;/code&gt; to quickly generate a CircuitPython script that runs directly on the Pico and performs all the requested tasks.&lt;/p&gt;&#xA;&lt;p&gt;Actually, the very first version of the script didn&amp;rsquo;t work. 
But Gemini was able to modify just a single line in the &lt;code&gt;Arducam.py&lt;/code&gt; driver provided by the manufacturer to fix it (specifically, it removed a zero from the frequency value).&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/one-line-diff.png&#34;&gt;&lt;img src=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/one-line-diff.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;The line highlighted in blue shows the only change that Gemini made to the &lt;code&gt;Arducam.py&lt;/code&gt; file; the original file is on the left, and the Gemini-modified version is on the right.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;All the other files provided by the manufacturer remained intact.&lt;/p&gt;&#xA;&lt;p&gt;As it turned out, the manufacturer&amp;rsquo;s driver worked perfectly with the Pico. Gemini simply learned how to use it to write the script that captures the image directly from the microcontroller (with a bit of patience, I could have done that myself).&lt;/p&gt;&#xA;&lt;h4 id=&#34;guide-the-llm-or-let-it-run-on-its-own&#34;&gt;Guide the LLM or let it run on its own?&lt;/h4&gt;&#xA;&lt;p&gt;The difference with respect to the &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;previous approach&lt;/a&gt; based on &lt;em&gt;vibe coding&lt;/em&gt; is striking.&lt;/p&gt;&#xA;&lt;p&gt;Back then, Gemini worked frantically for hours and created such a mess that I was forced to restart my Mac twice. 
Meanwhile, it had convinced itself (and me, too) that the &lt;code&gt;Arducam.py&lt;/code&gt; driver was incompatible with the Pico and thus rewrote it from scratch, discarding an essential library for communicating with the camera (the library marked by the red arrow),&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-libraries.png&#34;&gt;&lt;img src=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-libraries.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;Changes to the first few lines of the &lt;code&gt;Arducam.py&lt;/code&gt; file; original file on the left, Gemini&amp;rsquo;s modified version is on the right. The arrow marks the library that was removed by Gemini.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;and removing most of the functions defined in that file for no reason and without any real gain in efficiency or functionality (quite the opposite, actually!). The image below shows the difference in the length of the original code, on the left, compared to the modified version, on the right.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-file-length.png&#34;&gt;&lt;img src=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-file-length.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;Changes to the &lt;code&gt;Arducam.py&lt;/code&gt; file. The red rectangles highlight the difference in the number of lines between the original code (left) and Gemini&amp;rsquo;s modified version (right).&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;Not satisfied with that, Gemini also completely altered the &lt;code&gt;OV5642_reg.py&lt;/code&gt; file containing the camera configuration data, claiming  memory issues. 
These problems evidently didn&amp;rsquo;t exist, given that everything went smoothly in this second experiment using the manufacturer&amp;rsquo;s original file.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-registers.png&#34;&gt;&lt;img src=&#34;https://melabit.com/img/2026-03-16-antigravity-l-llm-lo-fa-meglio/diff-registers.png&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&#xA;&amp;ndash; &lt;em&gt;Changes to the &lt;code&gt;OV5642_reg.py&lt;/code&gt; file, containing the camera configuration data. The original configuration (left) was overridden by Gemini (right).&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;what-about-arduino&#34;&gt;What About Arduino?&lt;/h4&gt;&#xA;&lt;p&gt;At this point, it was inevitable that I would ask Gemini to generate the Arduino driver using the same information that I had already provided for CircuitPython.&lt;/p&gt;&#xA;&lt;p&gt;Gemini struggled more this time, mostly because I had forced it to make only minimal changes to the original Arducam libraries. However, this constraint was impossible to satisfy since the libraries do not natively support the Raspberry Pi Pico.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34; open&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Response&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;strong&gt;Researching Arducam Library Compatibility&lt;/strong&gt;&#xA;    I&amp;rsquo;ve thoroughly inspected both the original &lt;code&gt;ArduCAM&lt;/code&gt; library and the &lt;code&gt;Arducam_mini&lt;/code&gt; library. Neither version contains native support for the RP2040/RP2350 (Pico) architecture. 
Specifically, both libraries are missing the necessary definitions for &lt;code&gt;regtype&lt;/code&gt;, &lt;code&gt;regsize&lt;/code&gt;, &lt;code&gt;cbi&lt;/code&gt;, and &lt;code&gt;sbi&lt;/code&gt; for this platform, which is the root cause of the compilation errors.&#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Once the constraint was relaxed, things went smoothly, and in a short time, the driver modified specifically for the Pico and the related image acquisition program were ready.&lt;/p&gt;&#xA;&lt;h4 id=&#34;free-or-paid&#34;&gt;Free or paid?&lt;/h4&gt;&#xA;&lt;p&gt;Many argue that &lt;a href=&#34;https://www.oneusefulthing.org/p/a-guide-to-which-ai-to-use-in-the&#34;&gt;paid models are much better than free ones&lt;/a&gt;. Consequently, my &lt;em&gt;vibe coding&lt;/em&gt; problems would supposedly have disappeared if I had paid for Gemini or Claude.&lt;/p&gt;&#xA;&lt;p&gt;That&amp;rsquo;s not exactly how it went. During my testing, I mainly used the free version of Gemini to see what would happen to a non-professional user playing with these tools. However, when I ran out of tokens, I switched to the paid version of Gemini provided by the university, but I never noticed any significant differences. I can&amp;rsquo;t say much about Claude, because paying 200 dollars a month to use it seriously makes no sense for someone like me, who isn&amp;rsquo;t a professional programmer.&lt;/p&gt;&#xA;&lt;p&gt;What I did notice was that Claude Opus and Claude Sonnet were much better than Gemini at solving the problems that arose during code development. 
However, since I only used them once the work was already in progress, they might have just been helped by what Gemini had already done.&lt;/p&gt;&#xA;&lt;h4 id=&#34;conclusions&#34;&gt;Conclusions&lt;/h4&gt;&#xA;&lt;p&gt;It goes without saying that neither this experiment nor &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;the previous one&lt;/a&gt; is &lt;em&gt;scientific&lt;/em&gt;. For them to be considered as such, I would need to use different types of microcontrollers and modules, define a precise set of prompts, and measure how the agent responds to changing experimental conditions.&lt;/p&gt;&#xA;&lt;p&gt;Despite this, the experiment has one merit: it is fully &lt;em&gt;reproducible&lt;/em&gt;. Anyone can repeat it using the same low-cost hardware and prompts to verify whether my findings are correct.&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&lt;p&gt;This is something that the &lt;em&gt;vibe coding&lt;/em&gt; apostles with tens of millions of views always carefully avoid proposing. Their posts are full of certainties and bold predictions, yet they never provide any evidence to support their claims.&lt;/p&gt;&#xA;&lt;p&gt;Today, no one can deny that LLMs can be a great help during programming. However, it’s hard to believe that they can truly do the job without any human intervention. 
At least until those apostles decide to provide us with clear, repeatable data to support their claims.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;Although I know a little bit about programming, I had no idea how to develop a driver and didn&amp;rsquo;t want to spend days learning how to do it.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;Actually, this file is a library of functions. However, since microcontrollers don&amp;rsquo;t have an operating system, these functions access the hardware directly, just like the drivers that control the components of a computer. Therefore, in this case, the terms &lt;em&gt;library&lt;/em&gt; and &lt;em&gt;driver&lt;/em&gt; are interchangeable.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;The term &lt;em&gt;low cost&lt;/em&gt; must now be taken with a grain of salt. The shortage of electronic components has caused the price of the Pico to skyrocket, and even more so for the Arducam Mini 5MP Plus. The latter, which cost 40 dollars a few months ago, now costs more than triple that on Amazon or AliExpress. Fortunately, some retailers are maintaining more reasonable prices, at least for now.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>The Dory effect</title>
      <link>https://melabit.com/en/2026/03/09/the-dory-effect/</link>
      <pubDate>Mon, 09 Mar 2026 18:00:00 +0000</pubDate>
      <guid>https://melabit.com/en/2026/03/09/the-dory-effect/</guid>
      <description>&lt;p&gt;I am well aware that LLMs have &lt;a href=&#34;https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/&#34;&gt;poor memory&lt;/a&gt;, but I never imagined that I would suffer the consequences so quickly.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://melabit.com/img/2026-03-09-effetto-dory/Gemini_Generated_Image_q8d3k9q8d3k9q8d3.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Note to the reader.&lt;/strong&gt; This article complements the previous one, &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/&#34;&gt;Antigravity: a driver written by AI&lt;/a&gt;, and should be read afterward. However, here&amp;rsquo;s a brief recap for lazy readers.&lt;/p&gt;&#xA;&lt;p&gt;&lt;em&gt;Not knowing how to write a driver for an &lt;a href=&#34;https://en.wikipedia.org/wiki/Serial_Peripheral_Interface&#34;&gt;SPI camera&lt;/a&gt; attached to a Raspberry Pi Pico, I asked Gemini 3 Flash for help, working entirely inside Antigravity. After many hours, the agent managed to produce a working Arduino program. 
Then I made a fatal mistake by asking for even more without taking any precautions.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;premise&#34;&gt;Premise&lt;/h4&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://melabit.com/en/2026/03/04/antigravity-a-driver-written-by-ai/#ai-at-work&#34;&gt;working code&lt;/a&gt; generated with such great effort by &lt;code&gt;Gemini 3 Flash&lt;/code&gt; assumed the Pico acted as a bridge between my Mac Mini and the SPI camera.&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; On the Mac, a Python script commanded image capture, while the actual acquisition was performed by the Pico, running an Arduino program (a &lt;em&gt;sketch&lt;/em&gt; in platform jargon) that is essentially C/C++ code adapted for microcontrollers.&lt;/p&gt;&#xA;&lt;p&gt;Controlling everything from a PC works fine for early tests, but it becomes cumbersome when you want to use such a setup for fieldwork.&lt;/p&gt;&#xA;&lt;h4 id=&#34;automatic-captures&#34;&gt;Automatic captures&lt;/h4&gt;&#xA;&lt;p&gt;I asked Gemini to modify the code so it would automatically snap an image every N minutes or when a button is pressed, then send the picture to a computer over Wi‑Fi.&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;[...] However, this means that I need to use a Pico for the capture and another computer, even a Raspberry Pi Zero for commanding the capture process. 
Would it be possible to do everything on the Pico, eg press a button and perform the capture or via a timed process (defined in the .ino file itself) every N minutes (but there should be a mechanism that sends the images to a computer connected by WiFi, avoiding to fill the pico storage space).&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Gemini jumps straight into the work, still using Arduino. In principle, it should be a trivial task, but no matter how hard Gemini tries, the code never works. After many fruitless attempts, I got fed up and asked it to revert to the last known‑good version of the program.&lt;/p&gt;&#xA;&lt;h4 id=&#34;backtracking&#34;&gt;Backtracking&lt;/h4&gt;&#xA;&lt;p&gt;And here Gemini hits an unexpected wall because, despite my guidance and constant suggestions, there is absolutely no way for it to succeed.&lt;/p&gt;&#xA;&lt;p&gt;Stupidly, I hadn&amp;rsquo;t kept track of the various code versions as they were generated, and that&amp;rsquo;s entirely my fault.&lt;/p&gt;&#xA;&lt;p&gt;On the other hand, even though I am well aware that large language models (LLMs) have a &lt;strong&gt;working memory&lt;/strong&gt; that resembles that of &lt;a href=&#34;https://youtu.be/3JNLwlcPBPI?si=lBu7revGCsYQ-nJY&amp;amp;t=62&#34;&gt;Dory, the blue tang fish&lt;/a&gt;, I would have never imagined that these ultra-sophisticated systems couldn&amp;rsquo;t even &lt;strong&gt;retrospectively reconstruct&lt;/strong&gt; what they&amp;rsquo;ve already done, despite having dozens upon dozens of &lt;code&gt;implementation_plan.md&lt;/code&gt;, &lt;code&gt;task.md&lt;/code&gt;, and &lt;code&gt;walkthrough.md&lt;/code&gt; files that detail every previous step. 
I counted them: for this project alone, Antigravity produced over 250 distinct files.&lt;/p&gt;&#xA;&lt;p&gt;Going back to a prior state is something that tools like &lt;code&gt;Git&lt;/code&gt; and its predecessors &amp;ndash; tools that are entirely devoid of any &lt;em&gt;intelligence&lt;/em&gt; beyond that of their creators &amp;ndash; have been doing flawlessly for &lt;a href=&#34;https://blog.tarynmcmillan.com/a-history-of-version-control&#34;&gt;more than fifty years&lt;/a&gt;. Maybe the developers of these &lt;em&gt;intelligent&lt;/em&gt; agents could give it a thought, huh?&lt;/p&gt;&#xA;&lt;h4 id=&#34;epilogue&#34;&gt;Epilogue&lt;/h4&gt;&#xA;&lt;p&gt;In the end, stubbornly relying on my notes,&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt; I managed to get Gemini to repeat everything it had already done. It took hours, it reproduced all the earlier mistakes, but eventually the rebuilt code worked again.&lt;/p&gt;&#xA;&lt;p&gt;While I was at it, I also had Gemini redo the whole thing for CircuitPython, and that port required many more hours of work and errors. 
And before getting there, Gemini caused such severe issues that macOS had to disable USB communication with the Pico, forcing me to restart the Mac a couple of times.&lt;/p&gt;&#xA;&lt;p&gt;Needless to say, this time I had it keep track of every step with &lt;code&gt;git&lt;/code&gt;, and I also put all the code on &lt;a href=&#34;https://github.com/sabinomaggi/pico-camera5mp&#34;&gt;GitHub&lt;/a&gt;, just in case.&lt;/p&gt;&#xA;&lt;p&gt;One of the next steps will (hopefully) be MicroPython, but that&amp;rsquo;s another story.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;Why does Arduino struggle so much, despite being a far more mature microcontroller development platform than MicroPython or CircuitPython? Because the camera module I chose isn&amp;rsquo;t supported on the Raspberry Pi Pico; it only works with &lt;em&gt;classic&lt;/em&gt; Raspberry Pis (those running a full OS, such as the Pi 4, 5, or Zero), as well as with the Arduino and ESP8266 families of microcontrollers.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;English isn&amp;rsquo;t great here; it&amp;rsquo;s more a stream of thoughts than a polished discourse. But I prefer jotting down whatever comes to mind while writing rather than losing the thread of ideas.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;I&amp;rsquo;ll never stress enough how important it is to keep written notes of what you&amp;rsquo;re doing. 
At the beginning it seems like a waste of time, yet it&amp;rsquo;s the best way to avoid losing a huge amount of time later.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
    <item>
      <title>Antigravity: from surprise to doubt</title>
      <link>https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/</link>
      <pubDate>Mon, 16 Feb 2026 18:00:00 +0000</pubDate>
      <guid>https://melabit.com/en/2026/02/16/antigravity-from-surprise-to-doubt/</guid>
      <description>&lt;p&gt;As effective as Antigravity may be, digging a little deeper reveals that the agent-based systems working inside it, while helpful and capable of answering &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/&#34;&gt;many complex questions&lt;/a&gt;, are not exempt from the usual issues of the large language models (LLMs) we&amp;rsquo;ve been dealing with for the past three years.&lt;/p&gt;&#xA;&lt;p&gt;Let&amp;rsquo;s look at some practical examples.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/Gemini_Generated_Image_vzsj3tvzsj3tvzsj.jpg&#34; alt=&#34;&#34;&gt;&#xA;&amp;ndash; &lt;em&gt;Image generated by &lt;a href=&#34;https://gemini.google.com&#34;&gt;Google Gemini&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h4 id=&#34;example-1&#34;&gt;Example #1&lt;/h4&gt;&#xA;&lt;p&gt;After successfully extracting data from &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#task-1-extracting-data-from-images&#34;&gt;images in one of my articles&lt;/a&gt;, I immediately got to work trying to apply the same method to other figures similar to those analyzed the first time, such as &lt;a href=&#34;https://www.sciencedirect.com/science/article/pii/S0341816225007738&#34;&gt;this one&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;img src=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/1-s2.0-S0341816225007738-gr4_lrg.jpg&#34; alt=&#34;&#34;&gt;&lt;p&gt;The figure is clearly more complex than my previous ones: the top part features a histogram showing monthly precipitation and snow depth measurements for two locations with unpronounceable names in Québec, where different measurements are represented with varying shades and colors. 
Below it, and partially overlapping the histogram, there is a line graph with the average monthly temperatures for the two locations, with the vertical scale positioned on the right side.&lt;/p&gt;&#xA;&lt;p&gt;Even though the figure is more complex, I never expected &lt;code&gt;Gemini 3 Flash&lt;/code&gt; to struggle so much. It tries, and tries again, but every time something is off. The histogram isn&amp;rsquo;t a huge problem, but there&amp;rsquo;s no way to make it understand what to do with the data in the line graph. This issue persists even when I explain exactly where it should look to determine the temperature values for each month.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Hint: To evaluate temperatures for a given month you should take a point on the temperature plot that is between the two precipitation bars &#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;I am truly surprised by Gemini&amp;rsquo;s difficulty in repeating and extending the results of &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#task-1-extracting-data-from-images&#34;&gt;my first test&lt;/a&gt;, even as it continues to reassure me, claiming it has &lt;em&gt;understood&lt;/em&gt; perfectly what I want and has performed a &lt;em&gt;meticulous&lt;/em&gt; job.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34; open&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Answer&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;I have completed a meticulous re-evaluation of Figure 4 using the &#34;midpoint&#34; method you suggested (reading the temperature at the horizontal center between the two monthly precipitation bars).&#xA;Final Corrections Applied: [...]&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;Actually, it seems 
that Gemini has no idea what the &amp;ldquo;midpoint&amp;rdquo; method is and is just repeating what I say, changing only a few words. If it were otherwise, it would have resolved the issue in seconds, pulling out the right data just as it did the first time.&lt;/p&gt;&#xA;&lt;p&gt;In the end, I give up. It makes no sense to waste hours making an &lt;em&gt;intelligent agent&lt;/em&gt; do a job that would only take a few minutes by hand.&lt;/p&gt;&#xA;&lt;h4 id=&#34;example-2&#34;&gt;Example #2&lt;/h4&gt;&#xA;&lt;p&gt;I must admit I&amp;rsquo;m disappointed, but perhaps I expected too much from my intelligent agent.&lt;/p&gt;&#xA;&lt;p&gt;So, I try to make things easier by choosing a graph from an &lt;a href=&#34;https://pubs.aip.org/aip/apl/article-abstract/86/4/042501/117168/Overdamped-Nb-Al-AlOx-Nb-Josephson-junctions&#34;&gt;old article of mine&lt;/a&gt; that shouldn&amp;rsquo;t be too challenging for Gemini 3 Flash.&lt;/p&gt;&#xA;&lt;img src=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/042501_fig2.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Instead, it gets stuck and makes incomprehensible mistakes. With a little help, it manages to pull out the data shown in this graph, which still has several missing points (highlighted by the colored boxes) or, even worse, points added quite imaginatively (such as the one indicated by the arrow).&lt;/p&gt;&#xA;&lt;img src=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/recreate_figure_2.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Another disappointment. Why use an &lt;em&gt;intelligent agent&lt;/em&gt; if it&amp;rsquo;s faster to acquire the points by hand?&lt;/p&gt;&#xA;&lt;h4 id=&#34;example-3&#34;&gt;Example #3&lt;/h4&gt;&#xA;&lt;p&gt;But I don&amp;rsquo;t give up easily and try once more to get it to do something good. 
I use a figure from &lt;a href=&#34;https://pubs.aip.org/aip/apl/article-abstract/91/25/252505/334515/Nb-Al-AlOx-Nb-overdamped-Josephson-junctions-above&#34;&gt;another article&lt;/a&gt;, which is quite simple from a graphical point of view. However, it would be useful to be able to acquire it automatically, as the amount of data it contains makes it difficult to digitize accurately by hand.&lt;/p&gt;&#xA;&lt;img src=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_fig1.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;Burned by the previous experiences, I first ask it to describe what it &lt;em&gt;sees&lt;/em&gt; in the figure.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;Now could you describe in simple words what you &#34;see&#34; in figure 1?&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;The response is long, precise, and almost flawless,&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;answer&#34;&gt;&#xA;    &lt;summary class=&#34;answer&#34;&gt;&#xA;        Answer&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;answer&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;Looking at figure 252505_1_f1.jpeg, I can see:&lt;/p&gt;&#xA;    &lt;p&gt;&lt;strong&gt;Two side-by-side graphs&lt;/strong&gt; showing current-voltage (I-V) characteristics:&lt;/p&gt;&#xA;    &lt;p&gt;&lt;strong&gt;(a) Left graph (T = 2.0 K)&lt;/strong&gt;:&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;Shows a &lt;strong&gt;hysteretic&lt;/strong&gt; behavior - the curve forms a loop&lt;/li&gt;&#xA;    &lt;li&gt;The current jumps abruptly at certain voltage points&lt;/li&gt;&#xA;    &lt;li&gt;There&amp;rsquo;s a clear gap between the upward and downward voltage sweeps&lt;/li&gt;&#xA;    &lt;li&gt;The vertical jumps occur around ±1 mV&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;&lt;strong&gt;(b) 
Right graph (T = 5.8 K)&lt;/strong&gt;:&lt;/p&gt;&#xA;    &lt;p&gt;Shows &lt;strong&gt;non-hysteretic&lt;/strong&gt; behavior - smooth, continuous curve&lt;/p&gt;&#xA;    &lt;ul&gt;&#xA;    &lt;li&gt;No loop or gap between voltage sweeps&lt;/li&gt;&#xA;    &lt;li&gt;The transition is gradual rather than abrupt&lt;/li&gt;&#xA;    &lt;li&gt;The curve is more rounded at the critical points&lt;/li&gt;&#xA;    &lt;/ul&gt;&#xA;    &lt;p&gt;&lt;strong&gt;Key observation&lt;/strong&gt;: This demonstrates how the same Josephson junction behaves differently at different temperatures. At the lower temperature (2.0 K), it shows hysteresis (memory effect), while at the higher temperature (5.8 K), the hysteresis disappears and the junction becomes &amp;ldquo;overdamped.&amp;rdquo;&lt;/p&gt;&#xA;    &lt;p&gt;Both graphs show current ranging from about -15 to +15 mA on the y-axis, with voltage on the x-axis measured in millivolts (mV).&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;but, in reality, there is an error at the end of the response that would render any attempt to digitize the data in the two graphs impossible.&lt;/p&gt;&#xA;&lt;p&gt;When I point it out, Gemini 3 Flash corrects itself immediately and seems almost sorry. But, as we all know, LLMs are programmed to always please the user.&lt;/p&gt;&#xA;&lt;p&gt;I move forward and try to help it out, explaining exactly what it should do.&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;No problem. Now could you extract the values of voltage and current from the two images? 
Please produce a table for each image, and export the table as a csv file and a markdown file, with suffix `_table`.&lt;br&gt;  &#xA;Since one of the I-V curves is hysteretic, I suggest this strategy for extracting the data: start from V=0, increase V up to the maximum value following one of the paths, decrease V to 0 following the other path, do the same for the negative values of V.&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;But despite the suggestion, there&amp;rsquo;s no way to get it to do a decent job. The best I get is this, which at first glance looks like a great result.&lt;/p&gt;&#xA;&lt;img src=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_lowres_v22.png&#34; alt=&#34;&#34;&gt;&lt;p&gt;It&amp;rsquo;s a pity, though, that there&amp;rsquo;s a serious and incomprehensible error, which I&amp;rsquo;ll leave for you to discover.&lt;/p&gt;&#xA;&lt;p&gt;In any case, the previous image is truly the best of the best of the best, and it was obtained after hours of failed attempts (like &lt;a href=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_recreated.png&#34;&gt;this&lt;/a&gt; initial attempt, or &lt;a href=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_lowres_v13.png&#34;&gt;this&lt;/a&gt;, or &lt;a href=&#34;https://melabit.com/img/2026-02-16-antigravity-dalla-sorpresa-al-dubbio/252505_1_f1_lowres_final.png&#34;&gt;this&lt;/a&gt; one) and of hints that went from vague to increasingly precise, much like with an unprepared student from whom you try to get a few words out. 
Not to mention responses that always tried to adapt to the current suggestion without ever, or almost ever, making use of what had been discussed previously.&lt;/p&gt;&#xA;&lt;p&gt;All these fruitless attempts were made even more frustrating by the fact that, several years ago, I had solved the same problem accurately and reproducibly with barely two hundred lines of Python code (&lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#fn:3&#34;&gt;I remembered wrong&lt;/a&gt;, it wasn&amp;rsquo;t C but Python).&lt;/p&gt;&#xA;&lt;p&gt;And when I finish explaining exactly how it should be done, even suggesting that the key is to convert the original &lt;code&gt;JPEG&lt;/code&gt; file into a &lt;code&gt;PBM&lt;/code&gt; format (which is basically a grid of 0s or 1s depending on whether the pixel is white or black),&lt;/p&gt;&#xA;&#xA;&lt;details class=&#34;question&#34; open&gt;&#xA;    &lt;summary class=&#34;question&#34;&gt;&#xA;        Prompt&#xA;    &lt;/summary&gt;&#xA;    &lt;div class=&#34;question&#34;&gt;&#xA;        &#xA;    &#xA;    &lt;p&gt;My suggestion is to change focus completely. 
If I were you I would:&lt;/p&gt;&#xA;    &lt;ol&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;take note of the extreme values of voltage and current on the X and Y axes;&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;get rid of the extraneous information outside the plot area and of the text (if any) inside it;&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;convert the image to the PBM (or PGM) format, that gives you a clean grid of black and white (or gray and white) pixels;&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;remove the extraneous grid dots using a 3x3 or 5x5 filter (or anything like that);&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;li&gt;&#xA;    &lt;p&gt;calculate the X and Y position of each black pixel, converting the [values] to Voltage and Current (it is just a proportion).&lt;/p&gt;&#xA;    &lt;/li&gt;&#xA;    &lt;/ol&gt;&#xA;    &lt;p&gt;Clearly you can change the order of the steps or do something different if you believe your approach is better than mine.&lt;/p&gt;&#xA;    &#xA;&#xA;&lt;/div&gt;&#xA;&lt;/details&gt;&#xA;&#xA;&lt;p&gt;there&amp;rsquo;s still no way to get it to do something useful.&lt;/p&gt;&#xA;&lt;p&gt;What&amp;rsquo;s even more frustrating is that Gemini didn&amp;rsquo;t hesitate to try to cheat! When it found the original file of measured values that I was using to compare its results with the &lt;em&gt;real&lt;/em&gt; data, it tried to use that file to generate the data I was asking for. 
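As an aside, steps 3 and 5 of the strategy quoted above fit in a few lines of Python (the language of my old two-hundred-line solution). What follows is only a rough sketch under loud assumptions: it expects a plain-text PBM (P1) of the already-cropped plot area, it skips the grid-dot filter of step 4, and the helper names (`parse_pbm`, `pixels_to_data`) are mine, not Gemini's; the axis extremes (±1 mV, ±15 mA) are the ones read off figure 1.

```python
# Sketch of steps 3 and 5: parse a plain PBM (P1) image of the cropped
# plot area and map each black pixel to (voltage, current) by proportion.
# The grid-dot filter (step 4) is deliberately omitted.

def parse_pbm(text):
    """Parse a plain PBM (P1) string into width, height, flat bit list."""
    tokens = [t for line in text.splitlines()
              if not line.startswith('#')   # drop comment lines
              for t in line.split()]
    assert tokens[0] == 'P1', 'expected a plain PBM header'
    width, height = int(tokens[1]), int(tokens[2])
    bits = [int(t) for t in tokens[3:3 + width * height]]
    return width, height, bits

def pixels_to_data(width, height, bits, v_min, v_max, i_min, i_max):
    """Convert every black pixel (bit == 1) to physical units."""
    points = []
    for idx, bit in enumerate(bits):
        if bit != 1:
            continue
        x, y = idx % width, idx // width
        v = v_min + (v_max - v_min) * x / (width - 1)
        # PBM rows run top to bottom, so the current axis is flipped.
        i = i_max - (i_max - i_min) * y / (height - 1)
        points.append((v, i))
    return points

# Toy 3x3 raster: a diagonal "curve" from (-1 mV, -15 mA) to (+1 mV, +15 mA).
demo = "P1\n3 3\n0 0 1\n0 1 0\n1 0 0\n"
w, h, bits = parse_pbm(demo)
print(pixels_to_data(w, h, bits, -1.0, 1.0, -15.0, 15.0))
# -> [(1.0, 15.0), (0.0, 0.0), (-1.0, -15.0)]
```

The toy 3×3 raster stands in for a real scan; on an actual figure the same proportion maps thousands of black pixels to (V, I) pairs in a single pass, which is exactly why step 5 should be trivial.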
Clearly, the results were perfect this time, but did I need an artificial &lt;em&gt;intelligence&lt;/em&gt; to achieve that?&lt;/p&gt;&#xA;&lt;h4 id=&#34;further-examples&#34;&gt;Further examples&lt;/h4&gt;&#xA;&lt;p&gt;During the days I spent testing what I&amp;rsquo;ve described in this post, I also ran many other tests, such as asking it to write some simple LaTeX code or convert the tables contained in a few PDF files into &lt;code&gt;CSV&lt;/code&gt; or &lt;code&gt;Markdown&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;p&gt;It handled LaTeX well, but often struggled with tables, even when the layout was simple. There&amp;rsquo;s no need to go into detail here, as there aren&amp;rsquo;t many differences compared to previous examples.&lt;/p&gt;&#xA;&lt;h4 id=&#34;conclusions&#34;&gt;Conclusions&lt;/h4&gt;&#xA;&lt;p&gt;After the excellent results &lt;a href=&#34;https://melabit.com/en/2026/01/21/an-unexpected-antigravity/#task-1-extracting-data-from-images&#34;&gt;described in the previous article&lt;/a&gt;, I was confident I could use the intelligent agents integrated into Antigravity to easily extract data buried in many scientific articles from a few years ago, which would have allowed me to perform new comparative analyses quickly and efficiently.&lt;/p&gt;&#xA;&lt;p&gt;Unfortunately, all the more in-depth tests described in this post turned out to be unsuccessful.&lt;/p&gt;&#xA;&lt;p&gt;The model used here, almost always Gemini 3 Flash (though I also briefly tried Gemini 3 Pro and Claude Sonnet 4.5),&lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; showed &lt;strong&gt;inconsistent and unrepeatable behavior&lt;/strong&gt;. 
For example, it would use different methods to solve the same problem without it ever being clear whether the choice of an algorithm or library had a specific technical motivation or was just a matter of statistical probability in the generation of responses.&lt;/p&gt;&#xA;&lt;p&gt;Another serious issue was the &lt;strong&gt;lack of memory&lt;/strong&gt;, much like &lt;a href=&#34;https://youtu.be/3JNLwlcPBPI?si=AuWIQu80vuFst1KP&amp;amp;t=62&#34;&gt;Dory&lt;/a&gt;, the funny fish from the cartoons. Despite successfully solving the problem of extracting data from a histogram just a few days earlier, Gemini was unable to apply the same methodology to a similar case. On other occasions, once a problem was partially solved, Gemini would start working on it again from scratch instead of refining and improving the previous answer.&lt;/p&gt;&#xA;&lt;p&gt;But the problem that perhaps bothered me the most was the &lt;strong&gt;inability to generalize&lt;/strong&gt; the work already done. If you ask it to extract data from an image, Gemini creates a program suited only for &lt;em&gt;that&lt;/em&gt; specific image, without ever trying to develop a general method applicable to other images of the same type. Similarly, if you ask it to generate a graph from a data file, Gemini creates a script suited only for &lt;em&gt;that&lt;/em&gt; file, which is nearly impossible to adapt to other similar cases.&lt;/p&gt;&#xA;&lt;p&gt;When humans tackle a problem, they tend to become fond of the solution and try to reuse and extend it to similar problems. An LLM doesn&amp;rsquo;t care; it reinvents the wheel at every new iteration. It seems like the triumph of throwaway culture, a waste of resources and intelligence (however artificial) taken to the extreme.&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;m just a guy playing around with these tools. 
However, very serious scientific studies &amp;ndash; which, for instance, tried to use LLMs to rediscover &lt;a href=&#34;https://arxiv.org/abs/2507.06952v4&#34;&gt;Newton&amp;rsquo;s law of gravitation&lt;/a&gt; starting from the orbits of planets around the Sun, or to predict the &lt;a href=&#34;https://pubs.rsc.org/en/content/articlelanding/2025/dd/d5dd00176e&#34;&gt;behavior of a molecule&lt;/a&gt; based on different representations of its structure &amp;ndash; have reached the exact same conclusions.&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;Our analysis reveals that these models instead behave as if&#xA;they develop task-specific heuristics that fail to generalize.&lt;/em&gt;&lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&lt;/blockquote&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;Simply increasing model size does not improve performance or generalization ability.&lt;/em&gt;&lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;&lt;/blockquote&gt;&#xA;&lt;p&gt;Lastly, even if  at first glance it seems otherwise, we must never forget that LLMs &lt;strong&gt;don&amp;rsquo;t actually understand&lt;/strong&gt; what they are being asked; they are just very good at finding a needle in a very, very, very large haystack. An LLM can explain in great detail what a &amp;ldquo;Josephson junction I-V characteristic&amp;rdquo; is because it has &lt;em&gt;read&lt;/em&gt; many books and articles on the subject. 
But when it comes to connecting the ideal concept of &amp;ldquo;I-V characteristic&amp;rdquo; with its experimental representation in a figure, it fails miserably, proving it is just a well-trained parrot.&lt;/p&gt;&#xA;&lt;p&gt;Someone claims that the latest generation of LLMs can write an &lt;a href=&#34;https://arstechnica.com/ai/2026/02/sixteen-claude-ai-agents-working-together-created-a-new-c-compiler/&#34;&gt;entire C compiler&lt;/a&gt; on their own or tens of thousands of lines of &lt;a href=&#34;https://shumer.dev/something-big-is-happening&#34;&gt;perfectly functioning code&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Personally, I would be satisfied if these models could produce a data file or a table in a reproducible manner, without running into inexplicable errors or inconsistent results. They should also avoid awkward situations like the ones I experienced, where one day everything goes smoothly, and the next day, nothing works.&lt;/p&gt;&#xA;&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;&#xA;&lt;hr&gt;&#xA;&lt;ol&gt;&#xA;&lt;li id=&#34;fn:1&#34;&gt;&#xA;&lt;p&gt;I also used the paid version of Gemini, kindly provided by my university, without noticing any differences compared to the free version. Those who claim that paid models do things the free ones can only dream of should provide solid evidence before being believed.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:2&#34;&gt;&#xA;&lt;p&gt;K. Vafa &lt;em&gt;et al.&lt;/em&gt;, &lt;a href=&#34;https://arxiv.org/abs/2507.06952v4&#34;&gt;What Has a Foundation Model Found? 
Using Inductive Bias to Probe for World Models&lt;/a&gt;, arXiv 2507.06952v4, 2025.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li id=&#34;fn:3&#34;&gt;&#xA;&lt;p&gt;B. Yan &lt;em&gt;et al.&lt;/em&gt;, &lt;a href=&#34;https://pubs.rsc.org/en/content/articlelanding/2025/dd/d5dd00176e&#34;&gt;Inconsistency of LLMs in molecular representations&lt;/a&gt;, Digital Discovery vol. 4, 2025.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;</description>
    </item>
  </channel>
</rss>
