Sorry

This feed does not validate.

In addition, interoperability with the widest range of feed readers could be improved by implementing the following recommendations.

Source: http://planet.code4lib.org/atom.xml

  1. <?xml version="1.0" encoding="utf-8" standalone="yes" ?>
  2. <feed xmlns="http://www.w3.org/2005/Atom">
  3.  <title>Planet Code4Lib</title>
  4.  <link rel="alternate" type="text/html" href="http://planet.code4lib.org"/>
  5.  <id>https://planet.code4lib.org/</id>
  6.  <updated>2025-07-01T17:14:01+00:00</updated>
  7.  <generator uri="http://www.planetplanet.org/">http://intertwingly.net/code/venus/</generator>
  8.  
  9.    <entry>
  10.      <title>David Rosenthal: Tesla's Robotaxi Revolution!</title>
  11.      <link rel="alternate" type="text/html" href="https://blog.dshr.org/2025/07/teslas-robotaxi-revolution.html"/>
  12.      <id>tag:blogger.com,1999:blog-4503292949532760618.post-5122259739066938151</id>
  13.      <updated>2025-07-01T15:00:00+00:00</updated>
  14.      <content type="html">&lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0pvL_uFGu7LBhQPQLJY3_H47YRdvy0pqFHFElDI1QEdL4znjxWKrAY0NG64Q2zzG8MG7L2x3_63QYM1xRCJUPhcJH87F_IGSXskZyDT8ccx6KYQjLp4t5J8T41pISJmCmF9HFXYPFSMOJjJ6xTYSxv4FDL6gq4ZpCnqKcn4CVt53OOQ4E66AitntkGHSD/s1308/Screenshot%202025-06-27%20at%2014.41.48.png&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;102&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0pvL_uFGu7LBhQPQLJY3_H47YRdvy0pqFHFElDI1QEdL4znjxWKrAY0NG64Q2zzG8MG7L2x3_63QYM1xRCJUPhcJH87F_IGSXskZyDT8ccx6KYQjLp4t5J8T41pISJmCmF9HFXYPFSMOJjJ6xTYSxv4FDL6gq4ZpCnqKcn4CVt53OOQ4E66AitntkGHSD/w200-h102/Screenshot%202025-06-27%20at%2014.41.48.png&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://techcentral.co.za/meet-tesla-cybercab-autonomous-robo-taxi/253198/&quot;&gt;The mythical CyberCab&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  15. &lt;a href=&quot;https://x.com/ChrisO_wiki/status/1928011420177670418&quot;&gt;@ChrisO_wiki&lt;/a&gt; tweeted:&lt;br /&gt;
  16. &lt;blockquote&gt;
  17. How to tell if someone's bullshitting: watch for them to give a deadline that they repeatedly push back.
  18. &lt;/blockquote&gt;
  19. This was apropos of Donald Trump's approach to tariffs and Ukraine, but below the fold I apply the criterion to Elon Musk basing Tesla's future on its robotaxi service.&lt;br /&gt;
  20. &lt;span&gt;&lt;a name=&quot;more&quot;&gt;&lt;/a&gt;&lt;/span&gt;
  21. &lt;br /&gt;
  22. Jonathan V. Last's &lt;a href=&quot;https://www.thebulwark.com/p/elon-musk-self-driving-fsd-tesla-tony-stark-michael-scott&quot;&gt;&lt;i&gt;A Song of “Full Self-Driving”: Elon Isn’t Tony Stark. He’s Michael Scott.&lt;/i&gt;&lt;/a&gt; shows that Musk's bullshitting started almost a decade ago:&lt;br /&gt;
  23. &lt;blockquote&gt;
  24.  For years, Elon Musk has been promising that Teslas will operate completely autonomously in “Full Self Driving” (FSD) mode. And when I say years, I mean &lt;a href=&quot;https://motherfrunker.ca/fsd/&quot;&gt;years&lt;/a&gt;:&lt;br /&gt;
  25. &lt;ul&gt;
  26. &lt;li&gt;&lt;a href=&quot;https://fortune.com/2015/12/21/elon-musk-interview/&quot;&gt;December 2015&lt;/a&gt;: “We’re going to end up with complete autonomy, and I think we will have complete autonomy in approximately two years.”&lt;/li&gt;
  27. &lt;li&gt;&lt;a href=&quot;https://x.com/elonmusk/status/686279251293777920&quot;&gt;January 2016&lt;/a&gt;: “In ~2 years, summon should work anywhere connected by land &amp;amp; not blocked by borders, eg you’re in LA and the car is in NY.”&lt;/li&gt;
  28. &lt;li&gt;&lt;a href=&quot;https://youtu.be/wsixsRI-Sz4?si=Mylq3wbbVuENBWxL&amp;amp;t=4710&quot;&gt;June 2016&lt;/a&gt;: “I really would consider autonomous driving to be basically a solved problem. . . . I think we’re basically less than two years away from complete autonomy, complete—safer than a human. However regulators will take at least another year.”&lt;/li&gt;
  29. &lt;li&gt;&lt;a href=&quot;https://www.nbcnews.com/business/autos/driverless-tesla-will-travel-l-nyc-2017-says-musk-n670206&quot;&gt;October 2016&lt;/a&gt;: By the end of 2017 Tesla will demonstrate a fully autonomous drive from “a home in L.A., to Times Square . . . without the need for a single touch, including the charging.”&lt;/li&gt;
  30. &lt;li&gt;&lt;a href=&quot;https://electrek.co/2018/03/11/tesla-ceo-elon-musk-self-driving-next-year/&quot;&gt;March 2018&lt;/a&gt;: “I think probably by end of next year [end of 2019] self-driving will encompass essentially all modes of driving”&lt;/li&gt;
  31. &lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=MOy3MTZ9Dyg&quot;&gt;February 2019&lt;/a&gt;: “I think we will be feature complete—full self-driving—this year. Meaning the car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention, this year.&quot;&lt;/li&gt;
  32. &lt;/ul&gt;
  33. &lt;/blockquote&gt;
  34. &lt;a href=&quot;https://motherfrunker.ca/fsd/&quot;&gt;@motherfrunker&quot;&lt;/a&gt; tracks this BS, and the most recent entry is:&lt;br /&gt;
  35. &lt;ul&gt;
  36. &lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=Vwq_PhtvLwo&quot;&gt;January 2022&lt;/a&gt;: I will be shocked if we don't achieve FSD safer than a human this year&lt;/li&gt;
  37. &lt;/ul&gt;
  38. But finally, on June 22&lt;sup&gt;nd&lt;/sup&gt;, Tesla's robotaxi revolution arrived. Never one to miss an opportunity to &lt;a href=&quot;https://gizmodo.com/elon-musks-trillion-dollar-robotaxi-gamble-is-here-2000618771&quot;&gt;pump the stock with bullshit&lt;/a&gt;, Musk:&lt;br /&gt;
  39. &lt;blockquote&gt;
  40. envisions a future fleet, including a new “Cybercab” and “Robovan” with no steering wheels or pedals, that could boost Tesla’s market value by an astonishing $5 trillion to $10 trillion. On June 20, Tesla was worth $1.04 trillion
  41. &lt;/blockquote&gt;
  42. As usual, there are plenty of cult members &lt;a href=&quot;https://gizmodo.com/elon-musks-trillion-dollar-robotaxi-gamble-is-here-2000618771&quot;&gt;lapping up the BS&lt;/a&gt;:&lt;br /&gt;
  43. &lt;blockquote&gt;
  44. “My view is the golden age of autonomous vehicles starting on Sunday in Austin for Tesla,” said Wedbush analyst Dan Ives. “I believe it’s a trillion dollar valuation opportunity for Tesla.”
  45. &lt;/blockquote&gt;
  46. Dan Ives obviously only sipped 10-20% of Musk's CoolAid. Others &lt;a href=&quot;https://gizmodo.com/elon-musks-trillion-dollar-robotaxi-gamble-is-here-2000618771&quot;&gt;drank deeper&lt;/a&gt;:&lt;br /&gt;
  47. &lt;blockquote&gt;
  48. Investor Cathie Wood’s ARK Invest predicts robotaxis could account for 90% of Tesla’s profits by 2029. If they are right, this weekend’s launch was existential.
  49. &lt;/blockquote&gt;
  50. Tesla's net income from the trailing 12 months is &lt;a href=&quot;https://finance.yahoo.com/quote/TSLA/financials/&quot;&gt;around $6.1B&lt;/a&gt; and falling. Assuming, optimistically, that  they can continue to sell cars at the current rate, Cathie Woods is assuming that robotaxi &lt;i&gt;profits&lt;/i&gt; would be around $60B. Tesla's net margin is around 6%, so this implies revenue of almost $1T in 2029. Tesla charges $4.20/ride (ha! ha!), so this implies that they are delivering 231B rides/year, or around 23,000 times the rate of the entire robotaxi industry currently. Woods is projecting that in four year's time Tesla's robotaxi business will have almost as much revenue as Amazon ($638B), Microsoft ($245B) and Nvidia ($130B) &lt;i&gt;combined&lt;/i&gt;.&lt;br /&gt;
  51. &lt;br /&gt;
  52. Liam Denning's analysis in &lt;a href=&quot;https://www.bloomberg.com/opinion/articles/2025-06-27/tesla-s-800-billion-robotaxi-dream-is-finally-facing-reality&quot;&gt;&lt;i&gt;Tesla’s $800 Billion Robotaxi Dream Is Finally Facing Reality&lt;/i&gt;&lt;/a&gt; is only somewhat less optimistic:&lt;br /&gt;
  53. &lt;blockquote&gt;
  54. &quot;On generous assumptions, Tesla’s core EV business, generating 75% of gross profit but with falling sales, might be worth roughly $50 per share, only 15% of the current price. Much of the remainder relates to expectations around self driving. RBC Capital, for example, ascribes 59% of its price target, or $181 per share, to robotaxis and a further $53 to monetizing Full Self Driving technology. Combined, that is a cool $815 billion based on double-digit multiples ascribed to modeled revenue — not earnings — 10 to 15 years from now because, after all, it relates to businesses that barely make money today.&quot;
  55. &lt;/blockquote&gt;
  56. This all seems a tad optimistic, given the current state of Tesla's and the competition's robotaxi offerings.  Brad Templeton says &quot;&lt;a href=&quot;https://www.forbes.com/sites/bradtempleton/2025/06/22/tesla-misses-robotaxi-launch-date-goes-with-safety-drivers/&quot;&gt;pay no attention to the person in the passenger seat&lt;/a&gt;&quot;:&lt;br /&gt;
  57. &lt;blockquote&gt;
  58. Tesla’s much-anticipated June 22 “no one in the vehicle” “unsupervised” Robotaxi launch in Austin is not ready. Instead, Tesla is operating a limited service with Tesla employees on board the vehicle to maintain safety.&lt;br /&gt;
  59. ...&lt;br /&gt;
  60. Having an employee who can intervene on board, commonly called a safety driver, is the approach that every robocar company has used for testing, including testing of passenger operations. Most companies spend many years (Waymo spent a decade) testing with safety drivers, and once they are ready to take passengers, there are typically some number of years testing in that mode, though the path to removing the safety driver depends primarily on evaluation of the safety case for the vehicle, and less on the presence of passengers.&lt;br /&gt;
  61. &lt;br /&gt;
  62. In addition to Musk’s statements about the vehicle being unsupervised, with nobody inside, in general the removal of the safety driver is the biggest milestone in development of a true robotaxi, not an incremental step that can be ignored. As such, Tesla has yet to meet its goals.&lt;br /&gt;
  63. &lt;/blockquote&gt;
  64. Seven-and-a-half years after Musk's deadline for &quot;complete autonomy&quot; the best Tesla can do is a small robotaxi service for invited guests in a geofenced area of Austin with a safety driver in daylight. Waymo has &lt;a href=&quot;https://blog.dshr.org/feeds/posts/tesla_robotaxi_austin&quot;&gt;100 robotaxis in service in Austin&lt;/a&gt;. Three months ago &lt;a href=&quot;https://www.forbes.com/sites/bradtempleton/2025/03/07/robotaxis-mostly-waymo-are-giving-13-million-ridesmonth--why/&quot;&gt;Brad Templeton reported that&lt;/a&gt;:&lt;br /&gt;
  65. &lt;blockquote&gt;
  66. Waymo, the self-driving unit of Alphabet, announced recently that they are now providing 200,000 self-driving taxi rides every week with no safety driver in the car, only passengers.&lt;br /&gt;
  67. ...&lt;br /&gt;
  68. In China, though, several companies are giving rides with no safety driver. The dominant player is Baidu Apollo, which reports they did 1.1 million rides last quarter, which is 84,000 per week, and they now are all no-safety-driver. Pony.AI claims 26,000 per week, but it is not clear if all are with no safety driver. AutoX does not report numbers, but says it has 1,000 cars in operation. WeRide also does not report numbers.
  69. &lt;/blockquote&gt;
  70. It turns out that the safety driver is necessary. Craig Trudell and Kara Carlson's &lt;a href=&quot;https://www.bloomberg.com/news/articles/2025-06-23/tesla-robotaxi-videos-show-speeding-driving-into-wrong-lane&quot;&gt;&lt;i&gt;Tesla Robotaxi Incidents Draw Scrutiny From US Safety Agency&lt;/i&gt;&lt;/a&gt; reports on the first day of the robotaxi revolution:&lt;br /&gt;
  71. &lt;blockquote&gt;
  72. US auto safety regulators are looking into incidents where Tesla Inc.’s self-driving robotaxis appeared to violate traffic laws during the company’s first day offering paid rides in Austin.&lt;br /&gt;
  73. ...&lt;br /&gt;
  74. In one video taken by investor Rob Maurer, who used to host a Tesla podcast, a Model Y he’s riding in enters an Austin intersection in a left-turn-only lane. The Tesla hesitates to make the turn, swerves right and proceeds into an unoccupied lane meant for traffic moving in the opposite direction.&lt;br /&gt;
  75. &lt;br /&gt;
  76. A honking horn can be heard as the Tesla re-enters the correct lane over a double-yellow line, which drivers aren’t supposed to cross.&lt;br /&gt;
  77. &lt;br /&gt;
  78. In two other posts on X, initial riders in driverless Model Ys shared footage of Teslas speeding. A vehicle carrying Sawyer Merritt, a Tesla investor, reached 35 miles per hour shortly after passing a 30 miles per hour speed limit sign, a video he posted shows.
  79. &lt;/blockquote&gt;
  80. Tesla's level of incompetence is not a surprise. Tesla added &quot;(Supervised)&quot; to FSD in the US. They aren't allowed to call the technology &lt;a href=&quot;https://electrek.co/2025/03/25/tesla-drops-fsd-from-name-of-its-driver-assist-tech-in-china/&quot;&gt;&quot;Full Self-Driving&quot; in China&lt;/a&gt;. They recently rolled out &lt;a href=&quot;https://electrek.co/2025/03/25/tesla-drops-fsd-from-name-of-its-driver-assist-tech-in-china/&quot;&gt;&quot;Intelligent Assisted Driving&quot; in China&lt;/a&gt;:&lt;br /&gt;
  81. &lt;blockquote&gt;
  82. But immediately after that rollout, Tesla drivers started &lt;a href=&quot;https://electrek.co/2025/02/27/tesla-drivers-are-racking-up-fines-using-fsd-in-china/&quot;&gt;racking up fines&lt;/a&gt; for violating the law. Many roads in China are watched by CCTV cameras, and fines are automatically handed out to drivers to break the law.&lt;br /&gt;
  83. &lt;br /&gt;
  84. It’s clear that the system still needs more knowledge about Chinese roads in general, because it kept mistaking bike lanes for right turn lanes, etc. One driver racked up 7 tickets within the span of a single drive after driving through bike lanes and crossing over solid lines. If a driver gets enough points on their license, they could even have their license suspended.
  85. &lt;/blockquote&gt;
  86. Why did Tesla roll out their $8K &quot;Intelligent Assisted Driving&quot; in China? It might have something to do with &lt;a href=&quot;https://electrek.co/2025/03/25/tesla-drops-fsd-from-name-of-its-driver-assist-tech-in-china/&quot;&gt;this&lt;/a&gt;:&lt;br /&gt;
  87. &lt;blockquote&gt;
  88. BYD recently pushed a software update giving &lt;a href=&quot;https://electrek.co/2025/02/10/byds-most-affordable-evs-smart-driving-upgrade-free/&quot;&gt;smart driving features to all of its vehicles – for free&lt;/a&gt;.
  89. &lt;/blockquote&gt;
  90. There are already many competing robotaxi services in China. &lt;a href=&quot;https://www.channelnewsasia.com/east-asia/china-robotaxis-baidu-tesla-driverless-vehicles-4670506&quot;&gt;For example&lt;/a&gt;:&lt;br /&gt;
  91. &lt;blockquote&gt;
  92. Baidu is already operating robotaxi services in multiple cities in China. It provided close to 900,000 rides in the second quarter of the year, up 26 per cent year-on-year, according to its latest earnings call. More than 7 million robotaxi rides in total had been operated as of late July.
  93. &lt;/blockquote&gt;
  94. That was a year ago. It isn't just Waymo that is in a whole different robotaxi league than Tesla. And lets not talk about the fact that BYD, &lt;a href=&quot;https://blog.dshr.org/2025/04/elon-musk-threat-or-menace-part-6.html&quot;&gt;Xiaomi&lt;/a&gt; and others outsell Tesla in China because their products are better and cheaper. Tesla's response? Getting the White House to put a &lt;a href=&quot;https://www.bloomberg.com/news/articles/2025-03-26/trump-prepares-auto-tariff-announcement-as-soon-as-wednesday&quot;&gt;25% tariff on imported cars&lt;/a&gt;.&lt;br /&gt;
  95. &lt;br /&gt;</content>
  96.      <author>
  97.  <name>David. (noreply@blogger.com)</name>
  98. <uri>https://blog.dshr.org/</uri>
  99.      </author>
  100.    </entry>
  101.  
  102.    <entry>
  103.      <title>Digital Library Federation: DLF Digest: July 2025</title>
  104.      <link rel="alternate" type="text/html" href="https://www.diglib.org/dlf-digest-july-2025/"/>
  105.      <id>https://www.diglib.org/?p=86245</id>
  106.      <updated>2025-07-01T14:00:45+00:00</updated>
  107.      <content type="html">&lt;p&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;A monthly round-up of news, upcoming &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/groups/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;working group&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; meetings and events, and &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.clir.org/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;CLIR&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; program updates from the &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Digital Library Federation&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/category/dlf-digest/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;See all past Digests here&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. &lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
  108. &lt;p&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Happy July, DLF Community! We hope you’re having a pleasant summer so far and that you’re staying cool however you can. Between vacations and time to relax, we hope to catch you at a DLF Working Group meeting sometime this month. And, because the fall will be here before we know it, we hope you’re making plans to come to the Forum and Learn@DLF this November – &lt;/span&gt;&lt;a href=&quot;https://forum2025.diglib.org/registration/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;registration is open&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://dlfforum2025.sched.com/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;all programs have been released&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; for what’s sure to be a wonderful week in colorful Colorado.&lt;/span&gt;&lt;/p&gt;
  109. &lt;p&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;See you soon!&lt;/span&gt;&lt;/p&gt;
  110. &lt;p&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;— Aliya from Team DLF&lt;/span&gt;&lt;/p&gt;
  111. &lt;p&gt; &lt;/p&gt;
  112. &lt;h2&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;This month’s news:&lt;/span&gt;&lt;/h2&gt;
  113. &lt;ul&gt;
  114. &lt;li&gt;&lt;b&gt;Forum program announced, registration open:&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; The &lt;/span&gt;&lt;a href=&quot;https://dlfforum2025.sched.com/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;program&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; for the 2025 DLF Forum is now available. &lt;/span&gt;&lt;a href=&quot;https://forum2025.diglib.org/registration/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Register at the earlybird rate&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; to join us in Denver in November.&lt;/span&gt;&lt;/li&gt;
  115. &lt;li&gt;&lt;b&gt;Recordings now available: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Check out the recordings from &lt;/span&gt;&lt;a href=&quot;https://www.youtube.com/playlist?list=PLYPP1-8uH9c4tPJPPK7BL3WvHAqEpY-uj&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;IIIF Annual Conference&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; and the &lt;/span&gt;&lt;a href=&quot;https://www.youtube.com/@iipc8855/featured&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;IIPC Web Archiving Conference&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;, which both took place earlier this year.&lt;/span&gt;&lt;/li&gt;
  116. &lt;li&gt;&lt;b&gt;Opportunity&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;: &lt;/span&gt;&lt;a href=&quot;https://networks.h-net.org/h-net-spaces&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;H-NET Spaces&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; invites applications for its &lt;/span&gt;&lt;a href=&quot;https://networks.h-net.org/group/pages/20064279/spaces-cohort-program&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Spaces Cohort Program&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;, which supports early-stage projects and/or scholars in need of support and hands-on training in DH methods. Applications due July 1.&lt;/span&gt;&lt;/li&gt;
  117. &lt;li&gt;&lt;b&gt;Free virtual event: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;The Libraries &amp;amp; DH Special Interest Group of the &lt;/span&gt;&lt;a href=&quot;https://adho.org/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Alliance of Digital Humanities Organizations (ADHO)&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; is co-hosting &lt;/span&gt;&lt;a href=&quot;https://adholibdh.github.io/2025/03/27/dh2025/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Libraries &amp;amp; DH: Histories, Perspectives, Prospects Mini-Conference&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;, a free gathering at this summer’s DH2025 Conference in Lisbon, Portugal, on July 14.&lt;/span&gt;&lt;/li&gt;
  118. &lt;li&gt;&lt;b&gt;Call for climate-conscious bookworms: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;DLF’s &lt;/span&gt;&lt;a href=&quot;https://www.diglib.org/groups/climate-justice-working-group/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Climate Justice Working Group&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; summer book group is meeting Tuesday, July 29 at 1pm ET. They’ll be discussing chapters 5&amp;amp;6 of &lt;/span&gt;&lt;a href=&quot;https://www.fulcrum.org/concern/monographs/p8418r08x&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;After Disruption: A Future for Cultural Memory&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; by Trevor Owens, which is available open access. All are welcome to join, even if you’re not a regular participant in the working group and/or missed the first discussions. &lt;/span&gt;&lt;a href=&quot;https://msu.zoom.us/meeting/register/acJSlkjzSV27kVzJ0gu8iw&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Register here to join.&lt;/span&gt;&lt;/a&gt;&lt;b&gt;&lt;/b&gt;&lt;/li&gt;
  119. &lt;li&gt;&lt;b&gt;Office closure: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;CLIR offices will be closed July 3-4 in observance of Independence Day.&lt;/span&gt;&lt;/li&gt;
  120. &lt;/ul&gt;
  121. &lt;p&gt; &lt;/p&gt;
  122. &lt;h2&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;This month’s open DLF group meetings:&lt;/span&gt;&lt;/h2&gt;
  123. &lt;p&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;For the most up-to-date schedule of DLF group meetings and events (plus NDSA meetings, conferences, and more), bookmark the &lt;/span&gt;&lt;a href=&quot;https://www.diglib.org/opportunities/calendar/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;DLF Community Calendar&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. &lt;/span&gt;&lt;b&gt;Meeting dates are subject to change.&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; Can’t find the meeting call-in information? Email us at &lt;/span&gt;&lt;a href=&quot;mailto:info@diglib.org&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;info@diglib.org&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. Reminder: Team DLF working days are Monday through Thursday.&lt;/span&gt;&lt;/p&gt;
  124. &lt;ul&gt;
  125. &lt;li&gt;&lt;b&gt;DLF Born-Digital Access Working Group (BDAWG):&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; Tuesday, 7/1, 2pm ET / 11am PT&lt;/span&gt;&lt;/li&gt;
  126. &lt;li&gt;&lt;b&gt;DLF Digital Accessibility Working Group (DAWG):&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; Tuesday, 7/1, 2pm ET / 11am PT&lt;/span&gt;&lt;/li&gt;
  127. &lt;li&gt;&lt;b&gt;DLF AIG Cultural Assessment Working Group&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;: Monday, 7/14, 1:00pm ET /10am PT&lt;/span&gt;&lt;/li&gt;
  128. &lt;li&gt;&lt;b&gt;DLF AIG Metadata Assessment Working Group:&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; Thursday, 7/17, 1:15pm ET / 10:15am PT&lt;/span&gt;&lt;/li&gt;
  129. &lt;li&gt;&lt;b&gt;DLF AIG User Experience Working Group:&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; Friday, 7/18, 11am ET / 8am PT&lt;/span&gt;&lt;/li&gt;
  130. &lt;li&gt;&lt;b&gt;DLF Digital Accessibility Policy &amp;amp; Workflows Subgroup: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Friday, 7/25, 1pm ET / 10am PT&lt;/span&gt;&lt;/li&gt;
  131. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;DLF Digital Accessibility Working Group IT &amp;amp; Development (DAWG-IT) Subgroup: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Monday, 7/28, 1:15pm ET / 10:15am PT&lt;/span&gt;&lt;/li&gt;
  132. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;DLF Digitization Interest Group: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Monday, 7/28, 2pm ET / 11am PT&lt;/span&gt;&lt;/li&gt;
  133. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;DLF Committee for Equity and Inclusion: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Monday, 7/28, 3pm ET / 12pm PT&lt;/span&gt;&lt;/li&gt;
  134. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;DLF Climate Justice Working Group: &lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Tuesday,&lt;/span&gt; &lt;span style=&quot;font-weight: 400;&quot;&gt;7/29, 1pm ET / 10am PT&lt;/span&gt;&lt;/li&gt;
  135. &lt;/ul&gt;
  136. &lt;p&gt; &lt;/p&gt;
  137. &lt;p&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;DLF groups are open to ALL, regardless of whether or not you’re affiliated with a &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/about/members/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;DLF member organization&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/groups/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Learn more about our working groups on our website&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. Interested in scheduling an upcoming working group call or reviving a &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/groups/past/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;past group&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;? &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;https://www.diglib.org/dlf-organizers-toolkit/&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Check out the DLF Organizer’s Toolkit&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. As always, feel free to get in touch at &lt;/span&gt;&lt;/i&gt;&lt;a href=&quot;mailto:info@diglib.org&quot;&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;info@diglib.org&lt;/span&gt;&lt;/i&gt;&lt;/a&gt;&lt;i&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. &lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
  138. &lt;p&gt; &lt;/p&gt;
  139. &lt;h2&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Get Involved / Connect with Us&lt;/span&gt;&lt;/h2&gt;
  140. &lt;p&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;Below are some ways to stay connected with us and the digital library community: &lt;/span&gt;&lt;/p&gt;
  141. &lt;ul&gt;
  142. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;a href=&quot;https://share.hsforms.com/1MhcafbpARxGCIS1OQD6rKgc21y3&quot;&gt;&lt;b&gt;Subscribe&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; to the DLF Forum newsletter&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;
  143. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;Join, start, or revive&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; a working group and &lt;/span&gt;&lt;a href=&quot;https://wiki.diglib.org/Main_Page&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;browse their work on the DLF Wiki&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;
  144. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;a href=&quot;https://lists.clir.org/cgi-bin/wa?A0=DLF-ANNOUNCE&quot;&gt;&lt;b&gt;Subscribe&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; to our community listserv, DLF-Announce&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;
  145. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;a href=&quot;https://digital-conferences-calendar.info/&quot;&gt;&lt;b&gt;Bookmark&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; our Community Calendar&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;
  146. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;a href=&quot;https://www.diglib.org/about/join/&quot;&gt;&lt;b&gt;Learn more&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; about becoming a DLF member organization&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;. &lt;/span&gt;&lt;/li&gt;
  147. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;Follow us&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; on &lt;/span&gt;&lt;a href=&quot;https://www.linkedin.com/company/digital-library-federation/&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;LinkedIn&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://www.youtube.com/user/DLFCLIR&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;YouTube&lt;/span&gt;&lt;/a&gt;&lt;/li&gt;
  148. &lt;li style=&quot;font-weight: 400;&quot;&gt;&lt;b&gt;Contact us&lt;/b&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt; at &lt;/span&gt;&lt;a href=&quot;mailto:info@diglib.org&quot;&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;info@diglib.org&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-weight: 400;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;
  149. &lt;/ul&gt;
  150. &lt;p&gt;The post &lt;a href=&quot;https://www.diglib.org/dlf-digest-july-2025/&quot; rel=&quot;nofollow&quot;&gt;DLF Digest: July 2025&lt;/a&gt; appeared first on &lt;a href=&quot;https://www.diglib.org&quot; rel=&quot;nofollow&quot;&gt;DLF&lt;/a&gt;.&lt;/p&gt;</content>
  151.      <author>
  152.  <name>arubin</name>
  153. <uri>https://www.diglib.org</uri>
  154.      </author>
  155.    </entry>
  156.  
  157.    <entry>
  158.      <title>HangingTogether: Roles for resource sharing practitioners in making library materials accessible</title>
  159.      <link rel="alternate" type="text/html" href="https://hangingtogether.org/roles-for-resource-sharing-practitioners-in-making-library-materials-accessible/"/>
  160.      <id>https://hangingtogether.org/?p=16368</id>
  161.      <updated>2025-07-01T12:00:00+00:00</updated>
  162.      <content type="html">&lt;figure class=&quot;wp-block-image size-full&quot;&gt;&lt;a href=&quot;https://hangingtogether.org/wp-content/uploads/2025/06/image.png&quot;&gt;&lt;img alt=&quot;Two hands, one with darker skin and the other with lighter skin, pressing their thumbs together in a gesture of unity or agreement. The background features a pattern of glowing blue hexagons, creating a futuristic and collaborative atmosphere.&quot; class=&quot;wp-image-16374&quot; height=&quot;648&quot; src=&quot;https://hangingtogether.org/wp-content/uploads/2025/06/image.png&quot; width=&quot;975&quot; /&gt;&lt;/a&gt;&lt;/figure&gt;
  163.  
  164.  
  165.  
  166. &lt;p&gt;&lt;em&gt;&lt;sup&gt;Image from pixabay.com&lt;/sup&gt;&lt;/em&gt;&lt;/p&gt;
  167.  
  168.  
  169.  
  170. &lt;p&gt;Libraries play a crucial role in ensuring equitable access to information, yet many collected materials remain inaccessible to patrons with disabilities. How can resource-sharing practitioners leverage their expertise and systems to bridge this gap? This question was at the heart of a recent &lt;a href=&quot;https://www.oclc.org/research/partnership.html&quot;&gt;OCLC Research Library Partnership (RLP)&lt;/a&gt; Works in Progress Webinar, &lt;a href=&quot;https://www.oclc.org/research/events/2025/accessibility-of-library-materials-roles-ill.html&quot;&gt;Increasing the accessibility of library materials—Roles for ILL&lt;/a&gt;, where experts from three large academic libraries shared their practices and insights for improving collection accessibility for users who need accommodations.&lt;/p&gt;
  171.  
  172.  
  173.  
  174. &lt;p&gt;Accessibility has long been a topic of interest and action for the &lt;a href=&quot;https://www.oclc.org/research/activities/shares.html&quot;&gt;SHARES&lt;/a&gt; resource sharing consortium, and all three speakers were from SHARES institutions. As the consortium’s coordinator, I introduced the session by highlighting some of the group’s previous work on accessibility, which included surveying members about current practices, challenges, and aspirations around accessibility, creating a &lt;a href=&quot;https://www.oclc.org/content/dam/research/documents/shares/SHARES-ILL-accessibility-resources.pdf&quot;&gt;resource document &lt;/a&gt;on accessibility and ILL, and drafting accessibility provisions that were incorporated into the latest revision of the &lt;a href=&quot;https://www.ala.org/rusa/interlibrary-loan-code-united-states-explanatory-text&quot;&gt;US Interlibrary Loan (ILL) code&lt;/a&gt; in 2023. &lt;/p&gt;
  175.  
  176.  
  177.  
  178. &lt;p&gt;Next our three distinguished presenters took to the virtual stage:&lt;/p&gt;
  179.  
  180.  
  181.  
  182. &lt;ul class=&quot;wp-block-list&quot;&gt;
  183. &lt;li&gt;&lt;strong&gt;Clara Fehrenbach&lt;/strong&gt;, Document Delivery Services Librarian at the University of Chicago Library&lt;/li&gt;
  184.  
  185.  
  186.  
  187. &lt;li&gt;&lt;strong&gt;Ronald Figueroa&lt;/strong&gt;, Resource Sharing and Facility Manager at Syracuse University Libraries&lt;/li&gt;
  188.  
  189.  
  190.  
  191. &lt;li&gt;&lt;strong&gt;Brynne Norton&lt;/strong&gt;, Head of Resource Sharing &amp;amp; Reserves at the University of Maryland Libraries&lt;/li&gt;
  192. &lt;/ul&gt;
  193.  
  194.  
  195.  
  196. &lt;p&gt;&lt;strong&gt;Key takeaways&lt;/strong&gt;&lt;/p&gt;
  197.  
  198.  
  199.  
  200. &lt;p&gt;The session highlighted various models and strategies for enhancing accessibility in library collections. Here are key insights shared by the presenters:&lt;/p&gt;
  201.  
  202.  
  203.  
  204. &lt;ol class=&quot;wp-block-list&quot; start=&quot;1&quot;&gt;
  205. &lt;li&gt;&lt;strong&gt;Making procurement seamless for qualified patrons&lt;/strong&gt;: Clara Fehrenbach discussed the University of Chicago’s partnership with &lt;a href=&quot;https://disabilities.uchicago.edu/&quot;&gt;Student Disability Services (SDS)&lt;/a&gt;. This collaboration allows students to request alternate formats directly through the library catalog, ensuring privacy and streamlined access to necessary materials. The library scans materials that SDS can’t source elsewhere, while SDS evaluates and authorizes patron eligibility and does the actual document remediation.&lt;br /&gt;&lt;/li&gt;
  206.  
  207.  
  208.  
  209. &lt;li&gt;&lt;strong&gt;Providing PDFs of hard-to-get material still in copyright&lt;/strong&gt;: Brynne Norton outlined the &lt;a href=&quot;https://www.lib.umd.edu/find/request-digital&quot;&gt;Accessible Library Text Retrieval Program (ALTR)&lt;/a&gt; at the University of Maryland. This program provides text-searchable PDFs of in-copyright library materials for students with visual impairments and other disabilities, serving as a last resort when other accessible formats are unavailable. Accessibility and Disability Service (ADS) staff determine who qualifies for this service as an accommodation.  &lt;br /&gt;&lt;/li&gt;
  210.  
  211.  
  212.  
  213. &lt;li&gt;&lt;strong&gt;Offering multiple levels of remediation&lt;/strong&gt;: Ronald Figueroa outlined &lt;a href=&quot;https://disabilityresources.syr.edu/cdr-services/&quot;&gt;Alternate Format Services (AFS)&lt;/a&gt; at Syracuse University, which provides alternate formats for items owned, licensed, or obtained via ILL for qualified patrons. Service eligibility is determined by the Center for Disability Resources for students and by the ADA Coordinator for faculty and staff. AFS remediates for magnification, text-to-speech, or screen readers, according to need, and outsources jobs over 200 pages.&lt;/li&gt;
  214. &lt;/ol&gt;
  215.  
  216.  
  217.  
  218. &lt;p&gt;&lt;strong&gt;Practical tips for ILL practitioners&lt;/strong&gt;&lt;/p&gt;
  219.  
  220.  
  221.  
  222. &lt;p&gt;The presenters shared practical advice for libraries looking to start offering accessibility services or enhance an existing service:&lt;/p&gt;
  223.  
  224.  
  225.  
  226. &lt;ul class=&quot;wp-block-list&quot;&gt;
  227. &lt;li&gt;&lt;strong&gt;Start small&lt;/strong&gt;: Begin with basic services and gradually expand capabilities based on available resources.&lt;/li&gt;
  228.  
  229.  
  230.  
  231. &lt;li&gt;&lt;strong&gt;Understand patron needs&lt;/strong&gt;: Tailor services to meet the specific needs of patrons, whether it’s OCR documents, accessible PDFs, or other formats.&lt;/li&gt;
  232.  
  233.  
  234.  
  235. &lt;li&gt;&lt;strong&gt;Leverage partnerships&lt;/strong&gt;: Collaborate with Disability Services Offices (DSOs) on campus to determine eligibility and streamline the provision of accessible materials.&lt;/li&gt;
  236.  
  237.  
  238.  
  239. &lt;li&gt;&lt;strong&gt;Maintain communication&lt;/strong&gt;: Keep in close touch with partners to ensure ongoing support and address any changes in staff or procedures.&lt;/li&gt;
  240. &lt;/ul&gt;
  241.  
  242.  
  243.  
  244. &lt;p&gt;They also offered guiding principles for those who might be feeling overwhelmed by the prospect of starting up a new accessibility service:&lt;/p&gt;
  245.  
  246.  
  247.  
  248. &lt;ul class=&quot;wp-block-list&quot;&gt;
  249. &lt;li&gt;&lt;strong&gt;Let those who are already good at it do it: &lt;/strong&gt;ILL = scanning; Student Disability Services = eligibility.&lt;/li&gt;
  250.  
  251.  
  252.  
  253. &lt;li&gt;&lt;strong&gt;Don’t overpromise:&lt;/strong&gt; Understand what you actually have the bandwidth to offer before partnering.&lt;/li&gt;
  254.  
  255.  
  256.  
  257. &lt;li&gt;&lt;strong&gt;Don’t be shy:&lt;/strong&gt; Know that Student Disability Services folks are eager to partner.&lt;/li&gt;
  258.  
  259.  
  260.  
  261. &lt;li&gt;&lt;strong&gt;Tap into your ILL community:&lt;/strong&gt; Ask your peers for help.&lt;/li&gt;
  262. &lt;/ul&gt;
  263.  
  264.  
  265.  
  266. &lt;p&gt;&lt;strong&gt;Looking ahead&lt;/strong&gt;&lt;/p&gt;
  267.  
  268.  
  269.  
  270. &lt;p&gt;The webinar also looked to the horizon for upcoming developments in accessibility, including the integration of Optical Character Recognition (OCR) into OCLC’s Article Exchange document delivery application and burgeoning efforts by the ALA RUSA STARS Codes and Guidelines Committee to establish scanning standards, with a focus on improving scanning for accessibility. These initiatives are crucial for ensuring that all patrons have equitable access to library resources.&lt;/p&gt;
  271.  
  272.  
  273.  
  274. &lt;p&gt;This webinar provided valuable insights and practical strategies for improving accessibility in library collections. By leveraging collaborative efforts, specialized programs, and efficient workflows, libraries can make significant strides in ensuring that all patrons, regardless of their abilities, have access to the information they need. We invite you to learn more by watching the recorded webinar, and exploring the wealth of resources shared on the recording &lt;a href=&quot;https://www.oclc.org/research/events/2025/accessibility-of-library-materials-roles-ill.html&quot;&gt;webpage&lt;/a&gt;.&lt;/p&gt;
  275.  
  276.  
  277.  
  278. &lt;figure class=&quot;wp-block-embed is-type-video is-provider-vimeo wp-block-embed-vimeo wp-embed-aspect-16-9 wp-has-aspect-ratio&quot;&gt;&lt;div class=&quot;wp-block-embed__wrapper&quot;&gt;
  279.  
  280. &lt;/div&gt;&lt;/figure&gt;
  281. &lt;p&gt;The post &lt;a href=&quot;https://hangingtogether.org/roles-for-resource-sharing-practitioners-in-making-library-materials-accessible/&quot;&gt;Roles for resource sharing practitioners in making library materials accessible&lt;/a&gt; appeared first on &lt;a href=&quot;https://hangingtogether.org&quot;&gt;Hanging Together&lt;/a&gt;.&lt;/p&gt;</content>
  282.      <author>
  283.  <name>Dennis Massie</name>
  284. <uri>https://hangingtogether.org/</uri>
  285.      </author>
  286.    </entry>
  287.  
  288.    <entry>
  289.      <title>Journal of Web Librarianship: Articulate 360</title>
  290.      <link rel="alternate" type="text/html" href="https://www.tandfonline.com/doi/full/10.1080/19322909.2025.2507882?ai=1dl&amp;mi=co84bk&amp;af=R"/>
  291.      <id>/doi/full/10.1080/19322909.2025.2507882?ai=1dl&amp;mi=co84bk&amp;af=R</id>
  292.      <updated>2025-07-01T03:43:13+00:00</updated>
  293.      <content type="html">. &lt;br /&gt;</content>
  294.      <author>
  295.  <name>Elizabeth Fridrick Laura A Sheets a Assistant Professor, Collections and Electronic Resources Librarian, Bowling Green State University, Bowling Green, Ohio, USAb Associate Professor, Instruction Services Librarian, Bowling Green State University, Bowling Green, Ohio, USA</name>
  296. <uri>https://www.tandfonline.com/loi/wjwl20?ai=1dl&mi=co84bk&af=R</uri>
  297.      </author>
  298.    </entry>
  299.  
  300.    <entry>
  301.      <title>Open Knowledge Foundation: Open Data Editor Introduced to Mongolian Data Enthusiasts</title>
  302.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/30/open-data-editor-introduced-to-mongolian-data-enthusiasts/"/>
  303.      <id>https://blog.okfn.org/?p=34225</id>
  304.      <updated>2025-06-30T20:34:59+00:00</updated>
  305.      <content type="html">&lt;p&gt;The training, offered by Public Lab Mongolia, attracted the interests of Mongolian open data enthusiasts and practitioners from various fields, including sociology students and practitioners, data analysts, health researchers, university lecturers, civil society, and private sector professionals.&lt;/p&gt;
  306. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/30/open-data-editor-introduced-to-mongolian-data-enthusiasts/&quot;&gt;Open Data Editor Introduced to Mongolian Data Enthusiasts&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  307.      <author>
  308.  <name>Tsenguun Tumurkhuyag</name>
  309. <uri>https://blog.okfn.org</uri>
  310.      </author>
  311.    </entry>
  312.  
  313.    <entry>
  314.      <title>HangingTogether: Efficiënt ontdubbelen in WorldCat: hoe AI en catalogiseerwerk elkaar versterken</title>
  315.      <link rel="alternate" type="text/html" href="https://hangingtogether.org/efficient-ontdubbelen-in-worldcat-hoe-ai-en-catalogiseerwerk-elkaar-versterken/"/>
  316.      <id>https://hangingtogether.org/?p=16403</id>
  317.      <updated>2025-06-30T09:00:00+00:00</updated>
  318.      <content type="html">&lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt;In onze blogpost uit 2023, “&lt;a href=&quot;https://hangingtogether.org/machine-learning-and-worldcat-improving-records-for-cataloging-and-discovery/&quot;&gt;Machine Learning en WorldCat&lt;/a&gt;“, vertelden we voor het eerst hoe we machine learning inzetten om dubbele records in WorldCat op te sporen en samen te voegen.&lt;/em&gt;&lt;/p&gt;
  319.  
  320.  
  321.  
  322. &lt;p&gt;Het verwijderen van dubbele records is altijd belangrijk voor de kwaliteit van WorldCat. Het maakt catalogiseren efficiënter en verbetert de algehele kwaliteit. Nu bibliografische gegevens sneller dan ooit binnenstromen, moeten we ervoor zorgen dat records nauwkeurig, samenhangend en toegankelijk blijven. En dat in hoog tempo.&lt;/p&gt;
  323.  
  324.  
  325.  
  326. &lt;p&gt;Met AI kunnen we het ontdubbelen van gegevens snel en efficiënt opschalen. Toch blijft menselijke kennis en ervaring belangrijk voor het succes. Bij OCLC hebben we geïnvesteerd in een hybride aanpak: we gebruiken AI om enorme hoeveelheden data te verwerken, terwijl catalogiseerders en OCLC-experts de belangrijkste beslissingen blijven nemen.&lt;/p&gt;
  327.  
  328.  
  329.  
  330. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Van papieren strookjes naar machine learning&lt;/h2&gt;
  331.  
  332.  
  333.  
  334. &lt;p&gt;Voordat ik bij OCLC kwam, werkte ik al aan het verbeteren van bibliografische gegevens. Het samenvoegen van dubbele records deden we toen nog helemaal handmatig. Bibliotheken stuurden ons papieren strookjes met mogelijke duplicaten, vaak voorzien van uitleg van een catalogiseerder.&lt;/p&gt;
  335.  
  336.  
  337.  
  338. &lt;p&gt;We sorteerden duizenden strookjes in archiefkasten: groene strookjes voor boeken, blauwe voor niet-boeken en roze voor series. De hoeveelheid strookjes was zo groot dat we zelfs kantoormeubilair moesten gebruiken om ze op te slaan. Je kon uiteindelijk nergens meer een pen of notitieblok vinden.&lt;/p&gt;
  339.  
  340.  
  341. &lt;div class=&quot;wp-block-image&quot;&gt;
  342. &lt;figure class=&quot;aligncenter size-full is-resized&quot;&gt;&lt;a href=&quot;https://hangingtogether.org/wp-content/uploads/2025/05/Image-1_Cabinets-with-labeling-slips.jpeg&quot;&gt;&lt;img alt=&quot;A cluttered office storage room filled with tall cabinets and shelves overflowing with stacks of colorful paper folders in pink, green, and yellow. The papers are piled on top of cabinets, spilling out of shelves, and scattered on the floor, creating a chaotic and disorganized environment. Bright fluorescent lighting illuminates the space, emphasizing the abundance of materials.&quot; class=&quot;wp-image-16227&quot; height=&quot;1024&quot; src=&quot;https://hangingtogether.org/wp-content/uploads/2025/05/Image-1_Cabinets-with-labeling-slips.jpeg&quot; style=&quot;width: 484px; height: auto;&quot; width=&quot;1024&quot; /&gt;&lt;/a&gt;&lt;em&gt;&lt;sup&gt;Deze afbeelding is gemaakt met AI en geeft een indruk van hoe de rommelige gangen eruitzagen waar we de duplicaatstrookjes bewaarden. Het ziet er hier veel netter uit dan het in werkelijkheid was.&lt;/sup&gt;&lt;/em&gt;&lt;/figure&gt;&lt;/div&gt;
  343.  
  344.  
  345. &lt;p&gt;Als ik erop terugkijk, zie ik hoe vooruitziend die gezamenlijke inspanning was. Het was langzaam en methodisch werk, maar het liet zien hoe zorgvuldig we toen te werk gingen. Elk strookje stond voor een beslissing, een stukje menselijk oordeel dat bepaalde of records in ons systeem werden samengevoegd of behouden. Ondanks ons harde werk konden we het nooit helemaal bijhouden. De stapel duplicaten bleef groeien en we liepen altijd achter de feiten aan.&lt;/p&gt;
  346.  
  347.  
  348.  
  349. &lt;p&gt;Het verschil met nu is enorm. Sinds ik bij OCLC werk met AI-gestuurde ontdubbeling, besef ik pas hoe veel efficiënter we dit tegenwoordig kunnen aanpakken. Wat vroeger jaren duurde, doen we nu in weken, met meer nauwkeurigheid en in meer talen, schriften en materiaaltypes dan ooit tevoren. Toch blijft de kern van het werk hetzelfde: menselijke expertise is onmisbaar. AI is geen wondermiddel. Het leert van onze catalogiseringsnormen, ons professionele oordeel en onze correcties.&lt;/p&gt;
  350.  
  351.  
  352.  
  353. &lt;p&gt;Door een hybride aanpak te gebruiken, waarbij machine learning het zware werk doet en menselijke controle het proces stuurt en verfijnt, kunnen we een balans vinden tussen snelheid en precisie. Zo bouwen we voort op het beste van beide werelden.&lt;/p&gt;
  354.  
  355.  
  356.  
  357. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Innovatie en zorgvuldigheid in balans houden in WorldCat&lt;/h2&gt;
  358.  
  359.  
  360.  
  361. &lt;p&gt;Al tientallen jaren werken catalogiseerders, metadatabeheerders en OCLC-teams samen om de betrouwbaarheid van WorldCat te waarborgen. Zo blijft het een hoogwaardige, betrouwbare bron voor bibliotheken en onderzoekers. Het verwijderen van dubbele records is een belangrijk onderdeel van deze inspanning. Het zorgt ervoor dat alles overzichtelijker wordt, makkelijker doorzoekbaar is en beter uitwisselbaar is tussen verschillende systemen.&lt;/p&gt;
  362.  
  363.  
  364.  
  365. &lt;p&gt;Met AI kunnen we dubbele records nu op een nieuwe manier aanpakken. Hierdoor kunnen we veel meer duplicaten opsporen en samenvoegen dan ooit tevoren. De belangrijkste uitdaging is om AI verantwoord en transparant toe te passen, zodat het aansluit bij professionele catalogiseringsnormen.&lt;/p&gt;
  366.  
  367.  
  368.  
  369. &lt;p&gt;Deze schaalbare aanpak sluit naadloos aan bij onze langdurige rol als beheerders van gedeelde bibliografische gegevens. AI biedt ons de kans om menselijke expertise te versterken, zonder deze te vervangen.&lt;/p&gt;
  370.  
  371.  
  372.  
  373. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Een nieuwe kijk op ontdubbeling&lt;/h2&gt;
  374.  
  375.  
  376.  
  377. &lt;p&gt;Tot nu toe gebruikten we vaste algoritmes en veel handwerk om dubbele records samen te voegen. Dit werkte wel, maar had duidelijke grenzen.&lt;/p&gt;
  378.  
  379.  
  380.  
  381. &lt;p&gt;Met de AI-gestuurde ontdubbelingsmethoden van OCLC kunnen we nu veel meer bereiken:&lt;/p&gt;
  382.  
  383.  
  384.  
  385. &lt;ul class=&quot;wp-block-list&quot;&gt;
  386. &lt;li&gt;&lt;strong&gt;Meer talen en schriften:&lt;/strong&gt; Ons machine learning-algoritme verwerkt efficiënt niet-Latijnse schriften en records in allerlei talen. Hierdoor kunnen we sneller duplicaten opsporen in wereldwijde collecties.&lt;/li&gt;
  387.  
  388.  
  389.  
  390. &lt;li&gt;&lt;strong&gt;Meer soorten records:&lt;/strong&gt; AI herkent duplicaten in een breder scala aan bibliografische gegevens. Ook helpt het ons bij materiaaltypes die eerder lastig waren om te ontdubbelen.&lt;/li&gt;
  391.  
  392.  
  393.  
  394. &lt;li&gt;&lt;strong&gt;Bescherming van zeldzame en speciale collecties:&lt;/strong&gt; Bij zeldzame en unieke materialen zetten we geen AI in. Zo blijven bijzondere items in archieven en speciale collecties goed beschermd.&lt;/li&gt;
  395. &lt;/ul&gt;
  396.  
  397.  
  398.  
  399. &lt;p&gt;Dankzij deze verbeteringen kunnen we de metadata van WorldCat verder verbeteren, met aandacht voor meer materialen en talen. Dit stelt ons in staat om de kwaliteit van data verantwoord te verhogen.&lt;/p&gt;
  400.  
  401.  
  402.  
  403. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Wat “verantwoorde AI” in de praktijk betekent&lt;/h2&gt;
  404.  
  405.  
  406.  
  407. &lt;p&gt;De term ‘AI’ is breed en roept bij sommigen scepsis op. Dat is begrijpelijk. Verschillende AI-toepassingen brengen vragen met zich mee over vooroordelen, nauwkeurigheid en betrouwbaarheid.&lt;/p&gt;
  408.  
  409.  
  410.  
  411. &lt;p&gt;Onze aanpak is gebaseerd op een paar belangrijke principes:&lt;/p&gt;
  412.  
  413.  
  414.  
  415. &lt;ul class=&quot;wp-block-list&quot;&gt;
  416. &lt;li&gt;&lt;strong&gt;AI als aanvulling op menselijke expertise:&lt;/strong&gt; AI is bedoeld om mensen te ondersteunen, niet te vervangen. We hebben menselijke controle en het labelen van data ingebouwd, zodat onze AI-modellen leren volgens de beste catalogiseringspraktijken.&lt;/li&gt;
  417.  
  418.  
  419.  
  420. &lt;li&gt;&lt;strong&gt;Efficiëntie zonder concessies aan kwaliteit:&lt;/strong&gt; Onze AI is ontwikkeld om slim met computercapaciteit om te gaan, zonder dat dit ten koste gaat van de nauwkeurigheid en kwaliteit van de records.&lt;/li&gt;
  421.  
  422.  
  423.  
  424. &lt;li&gt;&lt;strong&gt;Duurzaamheid:&lt;/strong&gt; We zorgen ervoor dat onze systemen niet meer computerkracht gebruiken dan nodig. Zo blijven de resultaten goed, zonder verspilling. Door AI slim in te zetten, blijft ontdubbeling betaalbaar en toekomstbestendig, terwijl we blijven groeien.&lt;/li&gt;
  425. &lt;/ul&gt;
  426.  
  427.  
  428.  
  429. &lt;p&gt;Het doel is niet om mensen te vervangen, maar om hun kennis en tijd beter te benutten. Catalogiseerders kunnen zich daardoor richten op werk dat echt waarde toevoegt voor hun gebruikers, in plaats van eindeloos dubbele records op te ruimen.&lt;/p&gt;
  430.  
  431.  
  432.  
  433. &lt;p&gt;Daarnaast spelen catalogiseerders en onze ervaren OCLC-medewerkers een actieve rol in dit proces. Door gegevens te labelen en feedback te geven, helpen zij AI steeds beter te worden in het herkennen en verwerken van duplicaten.&lt;/p&gt;
  434.  
  435.  
  436.  
  437. &lt;h2 class=&quot;wp-block-heading&quot;&gt;AI als gezamenlijke inspanning en de weg vooruit&lt;/h2&gt;
  438.  
  439.  
  440.  
  441. &lt;p&gt;Ik mis de stapels papieren strookjes en het elk kwartaal opruimen van archiefkasten niet, maar ik waardeer wel wat ze symboliseerden: zorgvuldigheid en toewijding. AI vervangt die zorgvuldigheid niet, maar bouwt erop voort en tilt het naar een hoger niveau.&lt;/p&gt;
  442.  
  443.  
  444.  
  445. &lt;p&gt;Terwijl de tools zich blijven ontwikkelen, blijven onze principes hetzelfde. OCLC gebruikt al lange tijd technologie om bibliotheken te helpen bij het beheren van hun catalogi en collecties. Nu passen we diezelfde aanpak toe op AI: doelgericht, effectief en geworteld in onze gedeelde toewijding aan de kwaliteit van metadata.&lt;/p&gt;
  446.  
  447.  
  448.  
  449. &lt;p&gt;Deze manier van innoveren stelt bibliotheken in staat om te voldoen aan veranderende behoeften en waarde te blijven leveren aan hun gebruikers.&lt;/p&gt;
  450.  
  451.  
  452.  
  453. &lt;blockquote class=&quot;wp-block-quote has-background is-layout-flow wp-block-quote-is-layout-flow&quot; style=&quot;background-color: #fef0fe;&quot;&gt;
  454. &lt;p&gt;&lt;strong&gt;&lt;em&gt;Doe mee met OCLC’s datalabeling-initiatief en help de rol van AI bij het ontdubbelen&lt;/em&gt; &lt;em&gt;verbeteren&lt;/em&gt;&lt;/strong&gt;.&lt;/p&gt;
  455.  
  456.  
  457.  
  458. &lt;p&gt;&lt;em&gt;AI-gestuurde ontdubbeling is een gezamenlijke inspanning die voortdurend wordt verfijnd dankzij de input van de gemeenschap en professionele controle. Jouw bijdrage heeft direct invloed op de kwaliteit en efficiëntie van WorldCat. Daarmee komt het de hele bibliotheekgemeenschap ten goede.&lt;/em&gt;&lt;/p&gt;
  459.  
  460.  
  461.  
  462. &lt;p&gt;&lt;/p&gt;
  463.  
  464.  
  465.  
  466. &lt;p&gt;&lt;em&gt;Zo kun je deelnemen (informatie in het Engels):&lt;/em&gt;&lt;/p&gt;
  467.  
  468.  
  469.  
  470. &lt;ul class=&quot;wp-block-list&quot;&gt;
  471. &lt;li&gt;&lt;em&gt;&lt;a href=&quot;https://labelduplicates.worldcat.org/&quot;&gt;Interface voor datalabeling&lt;/a&gt; (WorldShare-login vereist)&lt;/em&gt;&lt;/li&gt;
  472.  
  473.  
  474.  
  475. &lt;li&gt;&lt;em&gt;&lt;a href=&quot;https://www.oclc.org/content/dam/oclc/worldcat/data-labeling-participation-instructions.pdf&quot;&gt;Deelname-instructies&lt;/a&gt; (pdf)&lt;/em&gt;&lt;/li&gt;
  476.  
  477.  
  478.  
  479. &lt;li&gt;&lt;em&gt;&lt;a href=&quot;https://www.oclc.org/content/dam/oclc/worldcat/data-labeling-faq.pdf&quot;&gt;Veelgestelde vragen&lt;/a&gt; (FAQs) (pdf)&lt;/em&gt;&lt;/li&gt;
  480. &lt;/ul&gt;
  481. &lt;/blockquote&gt;
  482.  
  483.  
  484.  
  485. &lt;p&gt;&lt;em&gt;Dit is een Nederlandse vertaling van de blogpost “&lt;a href=&quot;https://hangingtogether.org/scaling-de-duplication-in-worldcat-balancing-ai-innovation-with-cataloging-care/&quot;&gt;Scaling de-duplication in WorldCat: Balancing AI innovation with cataloging care&lt;/a&gt;“.&lt;/em&gt;&lt;/p&gt;
  486.  
  487.  
  488.  
  489. &lt;p&gt;&lt;/p&gt;
  490. &lt;p&gt;The post &lt;a href=&quot;https://hangingtogether.org/efficient-ontdubbelen-in-worldcat-hoe-ai-en-catalogiseerwerk-elkaar-versterken/&quot;&gt;Efficiënt ontdubbelen in WorldCat: hoe AI en catalogiseerwerk elkaar versterken&lt;/a&gt; appeared first on &lt;a href=&quot;https://hangingtogether.org&quot;&gt;Hanging Together&lt;/a&gt;.&lt;/p&gt;</content>
  491.      <author>
  492.  <name>Bemal Rajapatirana</name>
  493. <uri>https://hangingtogether.org/</uri>
  494.      </author>
  495.    </entry>
  496.  
  497.    <entry>
  498.      <title>HangingTogether: Deduplicación eficiente de WorldCat: Equilibrando la IA y la catalogación profesional</title>
  499.      <link rel="alternate" type="text/html" href="https://hangingtogether.org/deduplicacion-eficiente-de-worldcat-equilibrando-la-ia-y-la-catalogacion-profesional/"/>
  500.      <id>https://hangingtogether.org/?p=16364</id>
  501.      <updated>2025-06-30T09:00:00+00:00</updated>
  502.      <content type="html">&lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt;En la entrada de blog de agosto de 2023 titulada “&lt;/em&gt;&lt;a href=&quot;https://hangingtogether.org/machine-learning-and-worldcat-improving-records-for-cataloging-and-discovery/&quot;&gt;&lt;em&gt;Machine Learning y WorldCat&lt;/em&gt;&lt;/a&gt;&lt;em&gt;”, compartimos, por primera vez, nuestros esfuerzos para aprovechar el Aprendizaje Automático con el fin de mejorar la deduplicación en WorldCat.&lt;/em&gt;&lt;/p&gt;
  503.  
  504.  
  505.  
  506. &lt;p&gt;La deduplicación siempre ha sido un elemento central para garantizar la calidad de WorldCat, ya que mejora la eficiencia y la calidad de la catalogación. Sin embargo, con el aumento acelerado de datos bibliográficos, nos enfrentamos al desafío de mantener los registros actualizados, conectados y accesibles de forma rápida. La deduplicación impulsada por la IA ofrece una forma innovadora de agilizar este trabajo de manera eficiente, pero su éxito sigue dependiendo del conocimiento y criterio humano. En OCLC, hemos apostado por un enfoque híbrido que combina la capacidad de la IA para manejar grandes volúmenes de datos con el papel esencial de los catalogadores y de los expertos de OCLC a la hora de tomar las decisiones clave.&lt;/p&gt;
  507.  
  508.  
  509.  
  510. &lt;h2 class=&quot;wp-block-heading&quot;&gt;De las fichas en papel al Aprendizaje Automático&lt;/h2&gt;
  511.  
  512.  
  513.  
  514. &lt;p&gt;Antes de unirme a OCLC, trabajé en el área de calidad de datos bibliográficos en una época en la que la deduplicación se realizaba de manera completamente manual. Como parte de un programa de mejora de calidad, las bibliotecas nos enviaban fichas de papel detallando posibles duplicados, cada una acompañada de la explicación del catalogador en cuestión. Recuerdo que clasificábamos miles de estas fichas por colores: verde para libros, azul para materiales no bibliográficos, rosa para publicaciones seriadas. Incluso reutilizamos archivadores de la oficina para almacenar las fichas de duplicados que se nos desbordaban: encontrar bolígrafos y blocs de notas era prácticamente imposible.&lt;/p&gt;
  515.  
  516.  
  517. &lt;div class=&quot;wp-block-image&quot;&gt;
  518. &lt;figure class=&quot;aligncenter size-full is-resized&quot;&gt;&lt;a href=&quot;https://hangingtogether.org/wp-content/uploads/2025/05/Image-1_Cabinets-with-labeling-slips.jpeg&quot;&gt;&lt;img alt=&quot;A cluttered office storage room filled with tall cabinets and shelves overflowing with stacks of colorful paper folders in pink, green, and yellow. The papers are piled on top of cabinets, spilling out of shelves, and scattered on the floor, creating a chaotic and disorganized environment. Bright fluorescent lighting illuminates the space, emphasizing the abundance of materials.&quot; class=&quot;wp-image-16227&quot; height=&quot;1024&quot; src=&quot;https://hangingtogether.org/wp-content/uploads/2025/05/Image-1_Cabinets-with-labeling-slips.jpeg&quot; style=&quot;width: 484px; height: auto;&quot; width=&quot;1024&quot; /&gt;&lt;/a&gt;&lt;em&gt;&lt;/em&gt;&lt;em&gt;&lt;sup&gt;Esta imagen ha sido generada con IA para recrear mi recuerdo de los pasillos de archivadores abarrotados de fichas de duplicados. La IA lo ha recreado de manera más ordenada de lo que realmente era.&lt;/sup&gt;&lt;/em&gt;&lt;/figure&gt;&lt;/div&gt;
  519.  
  520.  
  521. &lt;p&gt;En retrospectiva, creo que fue un esfuerzo colaborativo con visión de futuro. Sin embargo, era un trabajo lento y meticuloso, que reflejaba la naturaleza minuciosa de nuestras tareas en aquel momento. Cada ficha representaba una decisión, un juicio humano que definía si los registros en nuestro sistema se fusionaban o permanecían separados. A pesar de todo el esfuerzo, este proceso estaba limitado por su volumen: siempre íbamos detrás de los duplicados en lugar de adelantarnos a ellos.&lt;/p&gt;
  522.  
  523.  
  524.  
  525. &lt;p&gt;Hoy, trabajando en la deduplicación impulsada por la IA en OCLC, me sorprende lo mucho que hemos avanzado. Lo que antes requería años ahora se completa en semanas, con mayor precisión y abarcando más idiomas, escrituras y tipos de materiales que nunca. Sin embargo, el núcleo del trabajo sigue siendo el mismo: el conocimiento humano es fundamental. La IA no es una solución mágica; aprende de nuestros estándares de catalogación, de nuestro criterio profesional y de nuestras correcciones.&lt;/p&gt;
  526.  
  527.  
  528.  
  529. &lt;p&gt;Al adoptar un enfoque híbrido para la deduplicación, podemos utilizar el Aprendizaje Automático para realizar el trabajo pesado mientras garantizamos que la supervisión humana guíe y refine el proceso.&lt;/p&gt;
  530.  
  531.  
  532.  
  533. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Equilibrando&lt;strong&gt; &lt;/strong&gt;la innovación y la responsabilidad en WorldCat&lt;/h2&gt;
  534.  
  535.  
  536.  
  537. &lt;p&gt;Durante décadas, catalogadores, especialistas en metadatos y equipos de OCLC han trabajado para mantener la integridad de WorldCat, asegurando que sea un recurso fiable y de alta calidad para bibliotecas e investigadores.  La deduplicación siempre ha sido una pieza clave en este trabajo, eliminando registros duplicados para optimizar la eficiencia, facilitar el descubrimiento y mejorar la interoperabilidad.&lt;/p&gt;
  538.  
  539.  
  540.  
  541. &lt;p&gt;Ahora, la IA nos permite abordar la eliminación de duplicados de nuevas maneras, ampliando drásticamente nuestra capacidad para identificar y fusionar registros duplicados a gran escala. Sin embargo, la verdadera cuestión no es solo cómo aplicar la IA, sino cómo hacerlo de manera responsable, transparente y en consonancia con los estándares profesionales de catalogación.&lt;/p&gt;
  542.  
  543.  
  544.  
  545. &lt;p&gt;Este modelo para ampliar la eliminación de duplicados es una continuación de nuestra misión histórica de preservar y gestionar datos bibliográficos compartidos. La IA ofrece una oportunidad para potenciar el conocimiento humano, no para sustituirlo.&lt;/p&gt;
  546.  
  547.  
  548.  
  549. &lt;h2 class=&quot;wp-block-heading&quot;&gt;El giro fundamental en la deduplicación&lt;/h2&gt;
  550.  
  551.  
  552.  
  553. &lt;p&gt;Históricamente, la deduplicación se ha basado en algoritmos deterministas y en el esfuerzo manual de catalogadores y de los equipos de OCLC. Aunque estos métodos han sido efectivos, también presentan ciertas limitaciones.&lt;/p&gt;
  554.  
  555.  
  556.  
  557. &lt;p&gt;Las técnicas de deduplicación impulsadas por IA que hemos desarrollado en OCLC nos permiten:&lt;/p&gt;
  558.  
  559.  
  560.  
  561. &lt;ul class=&quot;wp-block-list&quot;&gt;
  562. &lt;li&gt;&lt;strong&gt;Ampliar el alcance más allá del inglés y las lenguas romances: &lt;/strong&gt;Nuestro algoritmo de Aprendizaje Automático procesa con mayor precisión y eficiencia escrituras no latinas y registros en todos los idiomas, lo que mejora la deduplicación rápida en colecciones globales.&lt;/li&gt;
  563.  
  564.  
  565.  
  566. &lt;li&gt;&lt;strong&gt;Abordar una mayor variedad de tipos de registros:&lt;/strong&gt; La IA facilita la identificación de duplicados en un espectro más amplio de registros bibliográficos y aporta nuevos conocimientos sobre materiales más complejos de tratar.&lt;/li&gt;
  567.  
  568.  
  569.  
  570. &lt;li&gt;&lt;strong&gt;Preservar colecciones raras y especiales:&lt;/strong&gt; Actualmente no aplicamos procesos de deduplicación impulsados por IA a materiales raros, garantizando así la preservación de los registros únicos de archivos y colecciones especiales.&lt;/li&gt;
  571. &lt;/ul&gt;
  572.  
  573.  
  574.  
  575. &lt;p&gt;Estos avances permiten mejorar la precisión de los metadatos en una variedad más amplia de materiales e idiomas, lo que nos ayuda a ampliar el trabajo que realizamos para garantizar la calidad de los metadatos de WorldCat de manera responsable.&lt;/p&gt;
  576.  
  577.  
  578.  
  579. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Qué significa “IA responsable” en la práctica&lt;/h2&gt;
  580.  
  581.  
  582.  
  583. &lt;p&gt;El concepto de “IA” es amplio y, con frecuencia, genera escepticismo. Y no es para menos: muchas aplicaciones de Inteligencia Artificial plantean preocupaciones relacionadas con sesgos, precisión y fiabilidad.&lt;/p&gt;
  584.  
  585.  
  586.  
  587. &lt;p&gt;Nuestro enfoque se basa en algunas ideas clave:&lt;/p&gt;
  588.  
  589.  
  590.  
  591. &lt;ul class=&quot;wp-block-list&quot;&gt;
  592. &lt;li&gt;&lt;strong&gt;La IA debe complementar el conocimiento humano, no reemplazarlo.&lt;/strong&gt; Hemos integrado la revisión manual y el etiquetado de datos para garantizar que los modelos de IA se entrenen siguiendo las mejores prácticas de catalogación.&lt;/li&gt;
  593.  
  594.  
  595.  
  596. &lt;li&gt;&lt;strong style=&quot;color: initial;&quot;&gt;La eficiencia no debe comprometer la fiabilidad.&lt;/strong&gt;&lt;span style=&quot;color: initial;&quot;&gt; La deduplicación impulsada por IA está diseñada para optimizar los recursos computacionales, asegurando que la automatización no afecte la calidad de los registros.&lt;/span&gt;&lt;/li&gt;
  597.  
  598.  
  599.  
  600. &lt;li&gt;&lt;strong&gt;La sostenibilidad es fundamental.&lt;/strong&gt; Hemos desarrollado un sistema eficiente desde el punto de vista computacional, que reduce el uso innecesario de recursos mientras mantiene resultados de alta calidad. Al optimizar la huella tecnológica de la IA, garantizamos que la deduplicación sea rentable y viable a largo plazo.&lt;/li&gt;
  601. &lt;/ul&gt;
  602.  
  603.  
  604.  
  605. &lt;p&gt;Este modelo de deduplicación no busca reducir el papel de las personas, sino redirigir su conocimiento donde es más prioritario. Los catalogadores pueden enfocarse en tareas de alto valor que los conecten con sus comunidades, en lugar de dedicar tiempo a resolver registros duplicados.&lt;/p&gt;
  606.  
  607.  
  608.  
  609. &lt;p&gt;Además, los catalogadores y el equipo especializado de OCLC participan activamente en este proceso. A través del etiquetado de datos y valoraciones posteriores, los profesionales contribuyen a perfeccionar y mejorar la capacidad de la IA para identificar duplicados.&lt;/p&gt;
  610.  
  611.  
  612.  
  613. &lt;h2 class=&quot;wp-block-heading&quot;&gt;La IA como esfuerzo colaborativo y el camino por delante&lt;/h2&gt;
  614.  
  615.  
  616.  
  617. &lt;p&gt;No echo de menos las pilas de fichas ni las purgas trimestrales de archivadores, pero valoro profundamente lo que representaban. La IA no está sustituyendo ese cuidado, sino ampliándolo. Aunque las herramientas evolucionan, nuestros principios permanecen intactos. OCLC lleva años utilizando la tecnología para ayudar a las bibliotecas a gestionar sus catálogos y colecciones, y ahora aplicamos esa misma filosofía a la IA: de manera deliberada, efectiva y basada en nuestro compromiso compartido con la calidad de los metadatos. Este enfoque hacia la innovación permite a las bibliotecas adaptarse a necesidades cambiantes y ofrecer un mayor valor a sus usuarios.&lt;/p&gt;
  618.  
  619.  
  620.  
  621. &lt;blockquote class=&quot;wp-block-quote has-background is-layout-flow wp-block-quote-is-layout-flow&quot; style=&quot;background-color: #fef0fe;&quot;&gt;
  622. &lt;p&gt;&lt;strong&gt;&lt;em&gt;Participe en la iniciativa de etiquetado de datos de OCLC y contribuya al perfeccionamiento del rol de la IA en la deduplicación&lt;/em&gt;&lt;/strong&gt;.&lt;/p&gt;
  623.  
  624.  
  625.  
  626. &lt;p&gt;&lt;em&gt;La deduplicación impulsada por IA es un esfuerzo continuo y colaborativo que seguirá evolucionando gracias a las aportaciones de la comunidad y la supervisión profesional. Las contribuciones de los profesionales tendrán un impacto directo en la calidad y eficiencia de WorldCat, beneficiando a toda la comunidad bibliotecaria.&lt;/em&gt; &lt;/p&gt;
  627.  
  628.  
  629.  
  630. &lt;p&gt;&lt;/p&gt;
  631.  
  632.  
  633.  
  634. &lt;p&gt;&lt;em&gt;Cómo participar (información en Inglés):&lt;/em&gt;&lt;/p&gt;
  635.  
  636.  
  637.  
  638. &lt;ul class=&quot;wp-block-list&quot;&gt;
  639. &lt;li&gt;&lt;em&gt;&lt;a href=&quot;https://labelduplicates.worldcat.org/&quot;&gt;Interfaz de etiquetado de datos&lt;/a&gt; (se necesitan credenciales de WorldShare)&lt;/em&gt;&lt;/li&gt;
  640.  
  641.  
  642.  
  643. &lt;li&gt;&lt;em&gt;&lt;a href=&quot;https://www.oclc.org/content/dam/oclc/worldcat/data-labeling-participation-instructions.pdf&quot;&gt;Instrucciones de participación&lt;/a&gt; (pdf)&lt;/em&gt;&lt;/li&gt;
  644.  
  645.  
  646.  
  647. &lt;li&gt;&lt;em&gt;&lt;a href=&quot;https://www.oclc.org/content/dam/oclc/worldcat/data-labeling-faq.pdf&quot;&gt;FAQs&lt;/a&gt; (pdf)&lt;/em&gt;&lt;/li&gt;
  648. &lt;/ul&gt;
  649. &lt;/blockquote&gt;
  650.  
  651.  
  652.  
  653. &lt;p&gt;&lt;em&gt;Traducción al castellano de la entrada de blog “&lt;a href=&quot;https://hangingtogether.org/scaling-de-duplication-in-worldcat-balancing-ai-innovation-with-cataloging-care/&quot;&gt;Scaling de-duplication in WorldCat: Balancing AI innovation with cataloging care&lt;/a&gt;&lt;/em&gt;”.&lt;/p&gt;
  654.  
  655.  
  656.  
  657. &lt;p&gt;&lt;/p&gt;
  658. &lt;p&gt;The post &lt;a href=&quot;https://hangingtogether.org/deduplicacion-eficiente-de-worldcat-equilibrando-la-ia-y-la-catalogacion-profesional/&quot;&gt;Deduplicación eficiente de WorldCat: Equilibrando la IA y la catalogación profesional&lt;/a&gt; appeared first on &lt;a href=&quot;https://hangingtogether.org&quot;&gt;Hanging Together&lt;/a&gt;.&lt;/p&gt;</content>
  659.      <author>
  660.  <name>Bemal Rajapatirana</name>
  661. <uri>https://hangingtogether.org/</uri>
  662.      </author>
  663.    </entry>
  664.  
  665.    <entry>
  666.      <title>Web Science and Digital Libraries (WS-DL) Group at Old Dominion University: 2025-06-27: Paper Summary: MemoRAG: Moving towards Next-Gen RAG Via Memory-Inspired Knowledge Discovery</title>
  667.      <link rel="alternate" type="text/html" href="https://ws-dl.blogspot.com/2025/06/2025-06-27-paper-summary-memorag-moving.html"/>
  668.      <id>tag:blogger.com,1999:blog-953024975153422094.post-5523344179323239068</id>
  669.      <updated>2025-06-27T16:56:07+00:00</updated>
  670.      <content type="html">&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;span id=&quot;docs-internal-guid-9f8ae36f-7fff-07f7-e718-0a6a274f7da8&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; &lt;/span&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;&lt;span style=&quot;border: medium; display: inline-block; height: 428px; overflow: hidden; width: 624px;&quot;&gt;&lt;img height=&quot;428&quot; src=&quot;https://lh7-rt.googleusercontent.com/docsz/AD_4nXfzz7c47x0-C12qDAmcScmhcqAH9VGpbPZcDMJwOgjJ3ZSrEMMTJD9zfYRXP-rvL_JhaSq7fxuOloBUB1e4L9bNApX4e3aPnTU_H22yBKv5zS-1IsFkjVM2WF6KzMtAhF2lwTgKIw?key=zN872YrH0Td50__D7p4d4Cvz&quot; style=&quot;margin-left: 0px; margin-top: 0px;&quot; width=&quot;624&quot; /&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Comparison of Standard RAG systems and MemoRAG (&lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2409.05591&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; font-weight: 700; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Qian et al.&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;)&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;In my post “&lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2024/01/2024-01-16-alice-ai-leveraged.html&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Roboto, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;ALICE - AI Leveraged Information Capture and Exploration&lt;/span&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: black; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;”&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;, I proposed a system that unifies a Large Language Model (LLM) with a knowledge graph (KG) to archive routinely lost information from literature generating events. While assessing project risks, we have further researched hallucination and semantic sprawl mitigation strategies. I have been focusing on representation learning and embedding space methods to reduce the need for external knowledge bases such as in &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Retrieval-augmented_generation&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Retrieval Augmented Generation&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; (RAG) methods. Along the way, I discovered similar current research. In this post, we review “&lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2409.05591&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG: Moving towards Next-Gen RAG Via Memory-Inspired Knowledge Discovery&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;”, a novel approach to RAG by Hongjin Qian, Peitian Zhang, and Zheng Liu from the Beijing Academy of Artificial Intelligence; and Kelong Mao and Zhicheng Dou from Renmin University of China published in the &lt;/span&gt;&lt;a href=&quot;https://www2025.thewebconf.org/&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;ACM Web Conference 2025&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;. MemoRAG integrates memory mechanisms that will help bypass the limitations of the traditional RAG models. MemoRAG extends previous RAG systems by incorporating a novel dual memory model architecture which uses a lightweight memory model for maintaining long-term information and a heavy weight generator model that refines its output. We will discuss failures of the classical RAG systems, new memory components of MemoRAG, its performance across various benchmarks, and relative advantages on challenging reasoning, long-context summarization, and ambiguity handling.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Challenges of Traditional RAG Systems&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Although large language models have revolutionized natural language processing, they still tend to express weaknesses in hallucinating information, maintaining long-term information, dealing with complex queries, and synthesizing unstructured sources of information. “&lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2005.11401&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks”, Lewis et al. 2020&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; explains how RAG can bridge these gaps by allowing language models to leverage external databases in real time for improved accuracy and relevance . However, in “&lt;/span&gt;&lt;a href=&quot;https://aclanthology.org/2024.icnlsp-1.45/&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;SG-RAG: Multi-Hop Question Answering with Large Language Models Through Knowledge Graphs 2024&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;”, Saleh et al. show that traditional RAG systems largely fail when there is a need to perform tasks that are more complex, requiring much deeper contextual understanding, ambiguity handling, and synthesis across multiple sources. &lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Traditional systems are based on a retriever model that fetches relevant information from a pre-indexed knowledge base and a generator model which generates responses in light of the retrieved context. This works when the task is simple, fact-based question-answering but fails when dealing with more complex information or implicit queries that would require synthesis from multiple documents or databases. For example, answering a question that would only require retrieval from a single source and pulled via lexical/semantic similarity — like “when did Apollo 11 land on the moon” would be well served by traditional RAG systems. But a question about how some book develops a theme might involve stitching together different narrative elements with information from multiple sources, which is beyond the capability of most classic RAG systems. Also, when working with unstructured datasets, traditional RAG systems often fail in performance owing to their orientation toward finding an exact answer rather than synthesizing loosely related information.&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Reliance on Explicit Queries: &lt;/span&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Current RAG systems rely on simple, explicitly defined queries to access relevant data. They are intended to perform relevance matching between the input query and curated knowledge bases. But in practice, information requirements are often not clear or explicit, so the system must guess the user’s intention before loading the relevant content. Questions that require explicit reasoning, for example, or indirect citations (such as understanding themes in a text) are difficult to solve using standard RAG methods.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Difficulty with Structured and Distributed Data: &lt;/span&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Most RAG schemes are designed to work well with structured databases, where information can be pulled from it based on defined input parameters. But any task involving unstructured data (narratives, reports, etc.) is still an issue. Furthermore, when evidence stretches across multiple parts of a dataset, RAG models are typically incapable of bringing all this information together. Multi-hop reasoning, in which we must connect evidence to other points of data, is notoriously challenging for such systems. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Context Window Limitations: &lt;/span&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;LLMs operate on a finite context window and cannot take on large historical interactions or large datasets. Even in the context of RAG systems, this restriction still remains because retrieval algorithms are unable to bridge the gap between long-context processing and short-context processing in any meaningful way. As a result, RAG systems might retrieve weak or partial evidence when performing an exercise that requires inclusion of wide spread information.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Challenges in Query Refinement: &lt;/span&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;One of the problems of RAG systems is that they do not properly sanitize queries for more optimal retrieval outcomes. When the input from the user is ambiguous, standard RAG methods lack tools to translate the query into a practical query. This inconsistency often leads to obfuscated or incomplete fetching which negatively affects the final output.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Limited Applicability Beyond Straightforward Tasks: &lt;/span&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Standard RAG systems are best suited to tasks such as question answering or simple summarization, for which knowledge retrieval is direct. But they are unable to perform well in more difficult situations like pooling scattered evidence across documents, understanding abstract or complicated questions (e.g., thematic analysis) and working in domain-specific contexts like legal or financial records.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG’s solution to Traditional RAG challenges&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #3b3a3b; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG addresses these challenges through its memory-inspired architecture. The central novelty in MemoRAG is a memory model that enables long-term information storage and retrieval. While previous RAG systems had been based on query-based usage of databases for every different task, MemoRAG proposed a twin-system architecture for memory integration. This architecture includes:&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Lightweight Memory Model-Long-Context LLM&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: It compresses large data in memory tokens representing the whole database and providing clues for answering elaborate queries. Essentially, it acts like a summarizer that builds guideposts for the retrieval process.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-left: 36pt; margin-top: 0pt;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;LLM-heavy Generator Model&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: It fine-tunes the information retrieved and produces highly detailed responses in a coherent manner. It does exceptionally well on ambiguous queries, multi-hop reasoning, and summary insights with complexity.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; padding: 0pt 0pt 15pt;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 15pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #3b3a3b; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Using this architecture MemoRAG, operates in three main stages: &lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Memory Formation&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Memory model forms a truncated but semantic version of the database. The memory module takes raw input tokens (say, database entries) and compiles them as compressed memory tokens without discarding any semantic information. The mechanism for this is an attentional mechanism of transformers. For example: &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;The input tokens are passed over a series of transformer layers and context is taken in. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Memory tokens are introduced as long-term knowledge stores which it deciphers and encodes high-level semantic data of the input tokens.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Clue Generation&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Depending on the query entered, the memory model produces &quot;clues&quot; intermediate to it, in other words, handwritten solutions to the query that will inform the retrieve. With a continuous append of memory tokens and the elimination of less meaningful data from the input string, the memory module gradually shrinks long inputs into an efficient and concise memory image. The process works similarly to how human memory works – in which short-term information is reduced to long-term memories.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Example: &lt;/span&gt;&lt;span style=&quot;color: #0e0e0e; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;A query like &lt;/span&gt;&lt;span style=&quot;color: #0e0e0e; font-size: 10.5pt; font-style: italic; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;“How does the report discuss financial trends?”&lt;/span&gt;&lt;span style=&quot;color: #0e0e0e; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; might yield clues such as &lt;/span&gt;&lt;span style=&quot;color: #0e0e0e; font-size: 10.5pt; font-style: italic; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;“Identify revenue figures from recent years”&lt;/span&gt;&lt;span style=&quot;color: #0e0e0e; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; or &lt;/span&gt;&lt;span style=&quot;color: #0e0e0e; font-size: 10.5pt; font-style: italic; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;“Locate sections discussing market growth.”&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 15pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Evidence retrieval and Answer Generation&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Based on clues, relevant evidence is pulled from the database and the generator model generates the final result. Memory tokens are used as a mediator between input data and output. When asked a question, the memory module returns cues that tell you what you can expect the answer to be and how to retrieve evidence.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 15pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #3b3a3b; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG’s memory architecture enables it to synthesize information from disparate sources, bridging the gap where the traditional RAG systems falter. This contributes to the following benefits over traditional RAG systems:&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Long-Term Memory&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: MemoRAG achieves performance beyond the traditional limitations of finite context windows by using a memory model that can compress large data into an accessible representation.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Evidence-Based Retrieval&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Clues provided by the system help to steer the retrieval, bridging the gap between a vague query and evidence. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;color: #3b3a3b; font-family: Arial, sans-serif; font-size: 10.5pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 15pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Dual-System Design&lt;/span&gt;&lt;span style=&quot;font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Dividing the memory and generation functions between separate models makes MemoRAG free up computational space while maintaining accuracy.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; padding: 0pt 0pt 15pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #3b3a3b; font-size: 10.5pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;&lt;span style=&quot;border: medium; display: inline-block; height: 239px; overflow: hidden; width: 624px;&quot;&gt;&lt;img height=&quot;239&quot; src=&quot;https://lh7-rt.googleusercontent.com/docsz/AD_4nXfFdqkDfunG21MJ9pRTJFtD845hFdD88_2jp1FQ9_FWNTomr5Fixs5zfR_pDmde8B3NGQhn77Vmv72hDbzC86LJvNDVu-GwePGt7FtQvFY7q1XrFTY99YPntBgdvY4NIEqXHMgpaQ?key=zN872YrH0Td50__D7p4d4Cvz&quot; style=&quot;margin-left: 0px; margin-top: 0px;&quot; width=&quot;624&quot; /&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;background-color: white; line-height: 1.38; margin-bottom: 15pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #3b3a3b; font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Example of Implicit Query Workflow (&lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2409.05591&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 10.5pt; font-weight: 700; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Qian et al&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #3b3a3b; font-size: 10.5pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.)&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;To illustrate MemoRAG’s advantages, the paper compares MemoRAG with standard RAG by asking the question: &lt;/span&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;“How the book convey the theme of love?”&lt;/span&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; (referring to the Harry Potter series). Standard RAG fails to extract these implied relationships due to scattered evidence and poor formulation in the question. MemoRAG, however: &lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Creates an internal memory of the book. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Provides placeholder clues, for example defining relationships between characters. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Digs through the clues to pull out pertinent elements and builds a complete and precise solution. This is a way to demonstrate MemoRAG’s capability to handle implicit queries and distributed evidence effectively.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Training the Memory Module&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;The memory module must be trained at two different phases to ensure its efficacy: &lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Pre-Training&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: The memory module gets long contexts in the pre-training phase from various datasets (e.g., novel, papers, reports). That allows the model to learn how to squeeze and store semantics important elements in very long inputs. The &lt;/span&gt;&lt;a href=&quot;https://www.together.ai/blog/redpajama-data-v2&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;RedPajama dataset&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;, a large set of quality texts, was extensively used to do this.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Supervised Fine-Tuning (SFT)&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: At the fine-tuning phase, the memory module is trained on task data in order to get the most clues possible. This entails: &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;To give the model a question, long context, and an answer. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Programming it to output intermediate hints between the query and the long context. &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;The authors designed 17,116 supervised fine-tuning samples to improve MemoRAG’s response in questions answering and summarization tasks.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Performance and Benchmark Evaluation&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG was tested against two benchmarks:&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Standard Benchmarks&lt;/span&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Applied to datasets including &lt;/span&gt;&lt;a href=&quot;https://paperswithcode.com/dataset/narrativeqa&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;NarrativeQA&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; (questions-asking), &lt;/span&gt;&lt;a href=&quot;https://hotpotqa.github.io/&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;HotpotQA&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; (multi-hop reasoning), and &lt;/span&gt;&lt;a href=&quot;https://paperswithcode.com/dataset/multi-news&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MultiNews&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; (summaries of multiple documents).&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;a href=&quot;https://github.com/qhjqhj00/MemoRAG&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; font-weight: 700; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;ULTRADOMAIN&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;, designed by the authors to test the ability of LLMs to handle long-context tasks across diverse domains like the Law, Finance, and Education. The evaluation tasks included both in-domain, closer to that provided for training the model and out-of-domain, meaning the contextual presentation involved new and unfamiliar settings.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; margin-left: 36pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;In-domain tasks&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: MemoRAG demonstrates very solid gains over the comparator models in domains like Law and Finance, where precision and integration of information are key. As a simple example, in tasks dealing with legal matters, the memory model of MemoRAG allowed it to infer better relations between clauses.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; margin-left: 36pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Out-of-domain tasks&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: The performance of MemoRAG was also very impressive across out-of-domain settings, including summarization of philosophical works or fiction. It has secured a 2.3% gain compared to the top-performing baselines over the Fiction dataset by effectively synthesizing high-level themes and insights from long-context information.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Assessment metrics varied depending on the task but F1 score for questions and Rouge-L for summaries was the most common.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG’s performance was benchmarked against several modern baselines:&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Full Context Processing&lt;/span&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Conveniently exposing the full context to large language models.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;General RAG Approaches&lt;/span&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: &lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2402.03216&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;BGE-M3&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: A generic retriever.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2404.00610&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;RQ-RAG&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: A query optimization framework which reduces queries into simpler subqueries.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2212.10496&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;HyDE&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Creates fake documents for navigating the retrieved data.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Alternative Generative Models&lt;/span&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: MemoRAG’s generator model was compared to popular models such as &lt;/span&gt;&lt;a href=&quot;https://mistral.ai/news/announcing-mistral-7b/&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Mistral-7B&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://github.com/meta-llama/llama3&quot; style=&quot;text-decoration: none;&quot;&gt;&lt;span style=&quot;background-color: white; color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Llama3-8B&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;background-color: white; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Multi-Hop Reasoning and Distributed Evidence&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG is unique in that it can support multi-hop reasoning, meaning that it draws evidence from multiple sources in order to formulate answers to complex questions. Whereas traditional RAG models evaluate each query in isolation, the memory component of MemoRAG enables information to be fetched and synthesized from a set of distributed sources. This is particularly helpful in domain-specific areas such as finance, where determining factors for revenue growth requires synthesis from a number of reports and inferences from a set of relationships such as expanded market and reduced cost.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Enhanced Summarization Capabilities&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG did a great job summarizing unstructured and long documents like government reports, legal contracts, and academic papers. Because of its global memory capabilities, summaries are more concise, detailed, and accurate than those generated by baseline systems. This makes the system particularly useful in areas of journalism, law, and academia that require a great amount of data synthesis.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Innovations in Memory Integration&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG encodes its tokens with crucial aspects of large-scale contexts. These tokens serve as the ways that enable the model to recall information from memory. This makes the model highly efficient on long-context tasks. Regarding training, MemoRAG involves a two-step process: unsupervised pre-training over large datasets and task-specific supervised fine-tuning. This guarantees that memory tokens are optimized for the generation of relevant clues with the goal of improving retrieval accuracy and speed.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;ULTRADOMAIN benchmark results showed MemoRAG making massive gains:&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;On in-domain datasets (i.e., legal and financial contexts), MemoRAG scored an average F1 improvement of 9.7 points on the best-performing baseline.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG outperformed baselines across 18 disciplines on out-of-domain datasets by an average of 2.4 points (showcasing its versatility). For example:&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;For financial data, MemoRAG scored 48.0 F1 score (40.8 by the nearest baseline).&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;In the long-context domain (e.g., college textbooks), MemoRAG did not degrade much even at extremely long context lengths.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG’s advantage lies in its new clue-driven retrieval algorithm and memory model implementation. In contrast to conventional RAGs that find unreliable or redundant evidence, MemoRAG:&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Finds semantically rich clues to refine retrieval.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Analyzes long contexts more effectively, without the truncation complications that conventional systems have.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Works well on difficult complex problems like multi-hop reasoning and summarization, where other solutions fall flat.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG’s strategy of combining retrieval and generation of memory has important implications for developing artificial intelligence and its uses:&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Scalability of AI Models&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: MemoRAG shows how memory modules extend the optimal context window of LLMs and allow them to consume datasets which were once considered too large. This would open the door for cheap AI solutions that can outperform big models without significantly more computation power.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Better Knowledge Discovery&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: By enabling an exchange of raw data to meaningful information, MemoRAG can support knowledge-intensive activities such as scientific research, policy analysis, and technical writing.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Personalization&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: Since it’s built with and stores long-term memory, MemoRAG can match individual user preferences and records to create personalized recommendations for entertainment, e-learning, and e-commerce.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Future AI Research&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;: MemoRAG’s framework revolutionizes retrieval-based systems with its memory-inspired design. It might spur more research into hybrid architectures that integrate retrieval and memory better.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Conclusion: MemoRAG and the Future of Retrieval-Augmented AI&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;MemoRAG has addressed the fundamental drawbacks of traditional RAG architecture by adopting a memory-based architecture. With a memory module that produces global, context-dependent representations, MemoRAG is perfect for tasks involving complicated logic, extended contexts, and evidence gathering. &lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;The framework scores very well on the ULTRADOMAIN benchmark for its versatility and power to handle both in-domain and out-of-domain datasets. Using it for everything from legal defense to conversational AI, MemoRAG reveals how memory-based systems can help to expand the limits of what a language model can be.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;As AI systems mature, technologies such as MemoRAG could be the basis for handling even more sophisticated tasks, straddling the divide between retrieval and complex reasoning. My work on ALICE involves doing similar retrieval and clue systems via knowledge graph structure and the techniques used by MemoRAG will help inform ALICE’s development by both providing a comparative methodology to evaluate against and hints on how to achieve greater information coherence for truth grounded response. This performance by MemoRAG is a reflection of the next generation of AI systems that will be able to retrieve and generate information but also remember, infer, and synthesize knowledge in ways that are becoming increasingly indistinguishable from human cognition. As AI continues to evolve, it is models such as MemoRAG that will lie at the heart of nuanced information-based tasks characteristic of the modern world.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 12pt; margin-top: 12pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Jim Ecker&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;References&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p style=&quot;font-family: Arial, sans-serif; font-size: 11pt; line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: left; white-space: pre;&quot;&gt;&lt;/p&gt;&lt;ol&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Qian, H., Zhang, P., Liu, Z., Mao, K., and Dou, Z., “MemoRAG: Moving towards Next-Gen RAG Via Memory-Inspired Knowledge Discovery”, 2024. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2409.05591&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2409.05591&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Lewis P., Perez E., Piktus A., Petroni F., Karpukhin V., Goyal N., Kuttler H., et al., “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks”, 2020, URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2005.11401&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2005.11401&lt;/span&gt;&lt;/a&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Saleh, Ahmmad O. M., Tur, Gokhan., Saygin, Yucel., “SG-RAG: Multi-Hop Question Answering with Large Language Models Through Knowledge Graphs,” ICNLSP 2024. URL &lt;/span&gt;&lt;a href=&quot;https://aclanthology.org/2024.icnlsp-1.45/&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;2024.icnlsp-1.45&lt;/span&gt;&lt;/a&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Weber, M., Fu, D., Anthony, Q., Oren, Y., Adams, S., Alexandrov, A., Lyu, X., Nguyen, H., Yao, X., Adams, V., Athiwaratkun, B., Chalamala, R., Chen, K., Ryabinin, M., Dao, T., Liang, P., Ré, C., Rish, I., and Zhang, C., “RedPajama: an Open Dataset for Training Large Language Models”, 2024. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2411.12372&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2411.12372&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Kočiský, T., Schwarz, J., Blunsom, P., Dyer, C., Hermann, K. M., Melis, G., and Grefenstette, E., “The NarrativeQA Reading Comprehension Challenge,” , 2017. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.1712.07040&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.1712.07040&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Yang, Z., Qi, P., Zhang, S., Bengio, Y., Cohen, W. W., Salakhutdinov, R., and Manning, C. D., “HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering” , 2018. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.1809.09600&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.1809.09600&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Chen, J., Xiao, S., Zhang, P., Luo, K., Lian, D., and Liu, Z., “BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation” , 2024. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2402.03216&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2402.03216&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Chan, C.-M., Xu, C., Yuan, R., Luo, H., Xue, W., Guo, Y., and Fu, J., “RQ-RAG: Learning to Refine Queries for Retrieval Augmented Generation” , 2024. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2404.00610&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2404.00610&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Gao, L., Ma, X., Lin, J., and Callan, J., “Precise Zero-Shot Dense Retrieval without Relevance Labels” , 2022. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2212.10496&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2212.10496&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;.&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Jiang, A. Q., Sablayrolles, A., Mensch, A., Bamford, C., Chaplot, D. S., de las Casas, D., Bressand, F., Lengyel, G., Lample, G., Saulnier, L., Lavaud, L. R., Lachaux, M.-A., Stock, P., Scao, T. L., Lavril, T., Wang, T., Lacroix, T., and Sayed, W. E., “Mistral 7B” 2023. URL &lt;/span&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2310.06825&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;https://doi.org/10.48550/arXiv.2310.06825&lt;/a&gt;&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;Grattafiori, A., Dubey, A., Jauhri, A., Pandey, A., Kadian, et al., “The Llama 3 Herd of Models” , 2024. URL &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.48550/arXiv.2407.21783&quot; style=&quot;font-size: 11pt; text-decoration: none;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;&quot;&gt;https://doi.org/10.48550/arXiv.2407.21783.&lt;/span&gt;&lt;/a&gt;&lt;/li&gt;&lt;/ol&gt;&lt;p&gt;&lt;/p&gt;&lt;/span&gt;</content>
  671.      <author>
  672.  <name>Jim E. (noreply@blogger.com)</name>
  673. <uri>https://ws-dl.blogspot.com/</uri>
  674.      </author>
  675.    </entry>
  676.  
  677.    <entry>
  678.      <title>Mita Williams: How I Use Zotero (and Why)</title>
  679.      <link rel="alternate" type="text/html" href="https://librarian.aedileworks.com/2025/06/26/how-i-use-zotero-and-why/"/>
  680.      <id>https://librarian.aedileworks.com/?p=1910</id>
  681.      <updated>2025-06-26T17:53:10+00:00</updated>
  682.      <content type="html">I use Zotero when I am doing research or scholarly writing but I also have a handful of other reasons why I use Zotero which I thought I'd share.</content>
  683.      <author>
  684.  <name>Mita Williams</name>
  685. <uri>https://librarian.aedileworks.com</uri>
  686.      </author>
  687.    </entry>
  688.  
  689.    <entry>
  690.      <title>HangingTogether: Leading through uncertainty: Fostering morale and connection in challenging times</title>
  691.      <link rel="alternate" type="text/html" href="https://hangingtogether.org/leading-through-uncertainty-fostering-morale-and-connection-in-challenging-times/"/>
  692.      <id>https://hangingtogether.org/?p=16382</id>
  693.      <updated>2025-06-26T14:35:26+00:00</updated>
  694.      <content type="html">&lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt;This post is one in a &lt;/em&gt;&lt;a href=&quot;https://hangingtogether.org/tag/RLP-roundtables/&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;&lt;em&gt;series&lt;/em&gt;&lt;/a&gt;&lt;em&gt; documenting findings from the &lt;/em&gt;&lt;a href=&quot;https://www.oclc.org/research/partnership/engagement/rlp-leadership-roundtables.html&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;&lt;em&gt;RLP Leadership Roundtable&lt;/em&gt;&lt;/a&gt;&lt;em&gt; discussions. It is co-authored by Rebecca Bryant and Chela Weber.&lt;/em&gt; &lt;em&gt;It is the first of two blog posts that summarize outcomes from discussions focused on leadership in times of uncertainty and change. &lt;/em&gt;&lt;/p&gt;
  695.  
  696.  
  697. &lt;div class=&quot;wp-block-image&quot;&gt;
  698. &lt;figure class=&quot;alignright size-full is-resized&quot;&gt;&lt;a href=&quot;https://hangingtogether.org/wp-content/uploads/2025/04/image.png&quot;&gt;&lt;img alt=&quot;&quot; class=&quot;wp-image-16106&quot; height=&quot;334&quot; src=&quot;https://hangingtogether.org/wp-content/uploads/2025/04/image.png&quot; style=&quot;width: 345px; height: auto;&quot; width=&quot;540&quot; /&gt;&lt;/a&gt;&lt;/figure&gt;&lt;/div&gt;
  699.  
  700.  
  701. &lt;p&gt;Leading people and programs can be especially challenging during times of uncertainty and change. As many libraries grapple with budget constraints, staffing shortages, and resource limitations, the &lt;a href=&quot;https://www.oclc.org/research/partnership.html&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;OCLC Research Library Partnership (RLP) &lt;/a&gt;created space for leaders to share challenges, learn from peers, and provide mutual support. &lt;/p&gt;
  702.  
  703.  
  704.  
  705. &lt;p&gt;The RLP convened separate &lt;a href=&quot;https://www.oclc.org/research/partnership/engagement/rlp-leadership-roundtables.html&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;roundtables for special collections and research support&lt;/a&gt; leaders, both focused on leading through uncertainty. Participants explored several key questions: &lt;/p&gt;
  706.  
  707.  
  708.  
  709. &lt;ol class=&quot;wp-block-list&quot;&gt;
  710. &lt;li&gt;What is the most challenging aspect of your work in a leadership role today? &lt;br /&gt;&lt;/li&gt;
  711.  
  712.  
  713.  
  714. &lt;li&gt;What’s been your approach to sunsetting or pivoting work on services, projects, or other activities, especially when those decisions are unpopular? Do you have underlying principles that guide your decision making? How do you balance short- and long-term priorities? &lt;br /&gt;&lt;/li&gt;
  715.  
  716.  
  717.  
  718. &lt;li&gt;How do you effectively and transparently communicate during challenging times? How do you share enough without sharing too much? Do you have tools for supporting staff morale during extended periods of staff uncertainty? &lt;br /&gt;&lt;/li&gt;
  719.  
  720.  
  721.  
  722. &lt;li&gt;We all dealt with the major crisis of the pandemic and the uncertainty it brought, some in leadership roles while others have come into leadership since. What have you learned from this or other past crises that you can apply now? What are you learning about leadership from observing senior leadership at your institution? Has anyone done anything that inspires you or lifts your own morale? &lt;/li&gt;
  723. &lt;/ol&gt;
  724.  
  725.  
  726.  
  727. &lt;p&gt;Fifty individuals from 43 RLP institutions across five countries participated in a total of eight 90-minute discussions. The discussions revealed that leadership challenges transcend national boundaries, though specific contexts vary. Common themes emerged around financial constraints, communication challenges, and the need for adaptive leadership strategies. RLP leadership roundtables observe the Chatham House Rule—no specific comments are attributed to any individual or institution, but a list of participating institutions is provided at the bottom of this post.&lt;/p&gt;
  728.  
  729.  
  730.  
  731. &lt;p&gt;The resulting discussions were so rich that we couldn’t capture them all in one blog post. This is part one of the outcomes of our leadership discussions, and part two will follow shortly.  &lt;/p&gt;
  732.  
  733.  
  734.  
  735. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Budgetary and staffing pressures&lt;/h2&gt;
  736.  
  737.  
  738.  
  739. &lt;p&gt;Our discussions reveal a group of librarians and archivists grappling with a complex web of challenges while working to maintain operations and support staff.&lt;/p&gt;
  740.  
  741.  
  742.  
  743. &lt;p&gt;&lt;strong&gt;Budget constraints&lt;/strong&gt;. Library leaders face severe budget constraints as their most immediate challenge. Roundtable participants described how their organizations are facing cuts ranging from 5% to 33% for fiscal year 2026. One participant described their institution as already in a perilous condition following years of declining enrollment and budget recissions, and others described the precarity of state budgets.&lt;/p&gt;
  744.  
  745.  
  746.  
  747. &lt;p&gt;&lt;strong&gt;Uncertainty abounds. &lt;/strong&gt;One participant expressed concerns about how reductions to indirect costs will further impact library funding, prompting efforts to prepare campus faculty for potential library cuts. The uncertainty of when, and if, budgets will stabilize impacts morale.&lt;/p&gt;
  748.  
  749.  
  750. &lt;div class=&quot;wp-block-image&quot;&gt;
  751. &lt;figure class=&quot;alignright size-full is-resized&quot;&gt;&lt;a href=&quot;https://hangingtogether.org/wp-content/uploads/2025/06/144dcf3d-2680-428c-bce0-4693d172de21Original.jpg&quot;&gt;&lt;img alt=&quot;&quot; class=&quot;wp-image-16465&quot; height=&quot;449&quot; src=&quot;https://hangingtogether.org/wp-content/uploads/2025/06/144dcf3d-2680-428c-bce0-4693d172de21Original.jpg&quot; style=&quot;width: 388px; height: auto;&quot; width=&quot;599&quot; /&gt;&lt;/a&gt;&lt;em&gt;Bristlecone pine by Bain Butcher, 2013. 2013 Artist-in-Residence, Great Basin National Park, USA. &lt;/em&gt;&lt;br /&gt;&lt;/figure&gt;&lt;/div&gt;
  752.  
  753.  
  754. &lt;p&gt;&lt;strong&gt;Staffing pressures&lt;/strong&gt; &lt;strong&gt;compound these challenges&lt;/strong&gt;. Hiring freezes and budget constraints force many institutions to operate with skeleton crews. One leader who lost one-third of their personnel in the last two years cannot rebuild due to hiring restrictions. Another institution self-imposed a hiring freeze to avoid future position cuts. Retention is another struggle; one leader explained, “I have lost staff simply because they cannot afford [to live in our city] on the amount that they get paid at the library.”&lt;/p&gt;
  755.  
  756.  
  757.  
  758. &lt;p&gt;&lt;strong&gt;Operational impacts.&lt;/strong&gt; Budget uncertainty and staffing shortages create cascading effects throughout library operations. Library staff at many institutions feel overworked, stressed, and overwhelmed. Morale is low.  Staff take on new responsibilities without additional compensation, and several libraries reported reorganization efforts to address reduced capacity.&lt;/p&gt;
  759.  
  760.  
  761.  
  762. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Strengthening morale through recognition and connection&lt;/h2&gt;
  763.  
  764.  
  765.  
  766. &lt;p&gt;Several participants described their efforts to build morale by engaging teams in meaningful work and providing authentic support.&lt;/p&gt;
  767.  
  768.  
  769.  
  770. &lt;p&gt;&lt;strong&gt;Acknowledge fear and uncertainty&lt;/strong&gt;. With staff feeling anxiety, “especially about job security,” several participants described the need to extend compassion and convey to staff that “it’s okay to not be okay.”&lt;/p&gt;
  771.  
  772.  
  773.  
  774. &lt;p&gt;&lt;strong&gt;Connect to core purpose.&lt;/strong&gt;  It’s important to maintain a connection to the core mission despite challenges. External engagement seems particularly motivating. One leader noted: “When you’re with other people who are excited about your material, it gets you excited again” about the work. Another helps “my direct reports find work projects that they feel . . . like they’re doing something that is meaningful to them beyond just the kind of cultural heritage work that we normally do.”&lt;/p&gt;
  775.  
  776.  
  777.  
  778. &lt;p&gt;&lt;strong&gt;Foster solutions-oriented thinking.&lt;/strong&gt; Leaders struggled with team expectations to provide all the solutions, seeking ways to encourage solutions-oriented thinking.&lt;/p&gt;
  779.  
  780.  
  781.  
  782. &lt;ul class=&quot;wp-block-list&quot;&gt;
  783. &lt;li&gt;&lt;strong&gt;Identify things you can’t fix.&lt;/strong&gt; One participant helps staff identify “gravity problems”—issues that, like gravity, are not going to change and shouldn’t consume energy. Instead, they urge focus on problems that can be meaningfully addressed.  &lt;/li&gt;
  784.  
  785.  
  786.  
  787. &lt;li&gt;&lt;strong&gt;Promote collaborative work. &lt;/strong&gt;Several leaders promote projects that allow staff to support each other rather than relying heavily on supervisory relationships. Through collaboration, “they see each other’s work, they are interdependent, they can support each other more.”&lt;/li&gt;
  788.  
  789.  
  790.  
  791. &lt;li&gt;&lt;strong&gt;Convene &lt;/strong&gt;solutions-focused meetings. Many use staff meetings to address roadblocks and encourage team members to “come with proposed solutions” rather than just problems.&lt;/li&gt;
  792. &lt;/ul&gt;
  793.  
  794.  
  795.  
  796. &lt;p&gt;&lt;strong&gt;Recognition and morale building. &lt;/strong&gt;To buoy morale, many libraries have implemented regular recognition practices:&lt;/p&gt;
  797.  
  798.  
  799.  
  800. &lt;ul class=&quot;wp-block-list&quot;&gt;
  801. &lt;li&gt;&lt;strong&gt;“60-second shout-outs”&lt;/strong&gt; start meetings positively by allowing quick recognition of valued work&lt;/li&gt;
  802.  
  803.  
  804.  
  805. &lt;li&gt;&lt;strong&gt;“Shout-out cards”&lt;/strong&gt; let staff publicly recognize the work of others on cards posted in the staff break room&lt;/li&gt;
  806.  
  807.  
  808.  
  809. &lt;li&gt;&lt;strong&gt;Celebrating achievements&lt;/strong&gt; like publications, awards, promotions, and personal accomplishments at regular library meetings&lt;/li&gt;
  810.  
  811.  
  812.  
  813. &lt;li&gt;&lt;strong&gt;Community events&lt;/strong&gt; or “engagement days” convene staff members for team building and wellness activities. One institution recently hosted a kitten adoption event&lt;/li&gt;
  814.  
  815.  
  816.  
  817. &lt;li&gt;&lt;strong&gt;Campus award nominations&lt;/strong&gt; leverage institutional HR staff awards programs—for example, one institution has fostered a positive culture of appreciation through regular award nominations and wins&lt;/li&gt;
  818. &lt;/ul&gt;
  819.  
  820.  
  821.  
  822. &lt;p&gt;The impact of these efforts extends beyond individual recognition: “It’s been gratifying to see how supportive people are of each other, and this kind of peer support is meaningful to a lot of people, not just the person getting the shout-out.”&lt;/p&gt;
  823.  
  824.  
  825.  
  826. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Strategic approaches to uncertainty and change&lt;/h2&gt;
  827.  
  828.  
  829.  
  830. &lt;p&gt;Facing budget and staffing uncertainties, library leaders described strategic approaches that focus on core services, scenario planning, and selective implementation of new initiatives. They looked to professional values and institutional priorities to guide their decision-making.&lt;/p&gt;
  831.  
  832.  
  833.  
  834. &lt;h3 class=&quot;wp-block-heading&quot;&gt;Planning strategies&lt;/h3&gt;
  835.  
  836.  
  837.  
  838. &lt;p&gt;&lt;strong&gt;Scenario planning. &lt;/strong&gt;Participants’ institutions are conducting budget exercises to identify and maintain core services in anticipation of deep cuts. This “war footing” approach prioritizes core services with hopes of rebuilding later. Some included staff in documenting potential responses to different funding levels, facilitating informed decision-making. Several emphasized the need to take a library-wide view of cuts due to significant cross-departmental impacts.&lt;/p&gt;
  839.  
  840.  
  841.  
  842. &lt;p&gt;&lt;strong&gt;Defining and identifying services. &lt;/strong&gt;Informed decision-making requires understanding the depth and breadth of library activities. One institution created a comprehensive “service catalog” through a collaborative process to identify all services, including legacy offerings and those duplicated across departments. This approach reveals services they “don’t realize that we are doing” and serves as a “game changer” for the organization, supporting the library as it undergoes a significant restructuring to refocus services. Some institutions are working to clearly define what constitutes a “core service,” though this presents challenges as disagreement emerges about priorities.&lt;/p&gt;
  843.  
  844.  
  845.  
  846. &lt;h3 class=&quot;wp-block-heading&quot;&gt;Operational adaptations&lt;/h3&gt;
  847.  
  848.  
  849.  
  850. &lt;p&gt;&lt;strong&gt;Flexible contracts and purchasing. &lt;/strong&gt;One institution is negotiating greater flexibility by shifting from multi-year to single-year purchasing contracts, requesting hardship clauses where single-year options aren’t available, and purchasing some resources outright instead of licensing to manage future uncertainty.&lt;/p&gt;
  851.  
  852.  
  853.  
  854. &lt;p&gt;&lt;strong&gt;“Good enough” service standards.&lt;/strong&gt; Participants agreed that libraries and archives cannot and should not continue to provide the same level of service with significantly reduced resources. Instead, one library is implementing a “good enough” approach that emphasizes sustainability over perfection.&lt;/p&gt;
  855.  
  856.  
  857.  
  858. &lt;p&gt;&lt;strong&gt;Strategic sunsetting of services.&lt;/strong&gt; Several institutions are examining services to discontinue, which can be challenging because staff are deeply invested in their work. Examples include ending conservation work on circulating collections after staff departures, sunsetting an under-resourced research information management system, and reallocating a data visualization position to geospatial support to fill strategic campus gaps.&lt;/p&gt;
  859.  
  860.  
  861.  
  862. &lt;h3 class=&quot;wp-block-heading&quot;&gt;Leveraging change&lt;/h3&gt;
  863.  
  864.  
  865.  
  866. &lt;p&gt;&lt;strong&gt;Crisis as opportunity&lt;/strong&gt;. A few participants view recent challenges—including the pandemic and staff losses—as opportunities for positive change. These circumstances can serve as a catalyst for workflow reconceptualization, service prioritization, and improved institutional alignment, with one participant saying, “never let a good crisis go to waste.”&lt;/p&gt;
  867.  
  868.  
  869.  
  870. &lt;p&gt;&lt;strong&gt;Visible service impacts. &lt;/strong&gt;Some leaders are deliberately making service reductions visible to demonstrate the real impact of budget cuts, rather than maintaining the illusion that operations can continue unchanged. “I want them to be visible. I don’t want to not be able to serve people and make it hard to use our collections, but at the same time, I don’t think I do anyone a service . . . by pretending that we can continue to [work as we always have].” This transparency helps stakeholders understand the consequences of funding decisions.&lt;/p&gt;
  871.  
  872.  
  873.  
  874. &lt;h3 class=&quot;wp-block-heading&quot;&gt;Supporting staff through change&lt;/h3&gt;
  875.  
  876.  
  877.  
  878. &lt;p&gt;&lt;strong&gt;Managing staff expectations and morale. &lt;/strong&gt;The identity-driven nature of archives and special collections work creates particular challenges when asking these staff to reduce their efforts. As one leader noted, “People draw a lot of identity from their work” which “makes asking them to scale back even harder.” Leaders are helping staff adjust expectations to match current realities to identify what can be paused, as well as working on realistic individual goal planning.&lt;/p&gt;
  879.  
  880.  
  881.  
  882. &lt;p&gt;&lt;strong&gt;Boundaries and organizational support&lt;/strong&gt;. Participants emphasized the need to ensure that service reduction decisions are supported throughout the organizational hierarchy, offering frontline staff the agency to set boundaries with users.&lt;/p&gt;
  883.  
  884.  
  885.  
  886. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Looking ahead&lt;/h2&gt;
  887.  
  888.  
  889.  
  890. &lt;p&gt;These discussions accentuated that effective leadership during challenging times requires both strategic operational thinking and genuine care for people. Participants shared experiences that showed not only the weight of their decisions but also the collective strength found in peer networks, transparent communication, and a focus on meaningful work. The strategies they explored—scenario planning, morale-building practices, visible service impacts—reflect their efforts to balance institutional priorities with the well-being of staff and the communities they serve. &lt;/p&gt;
  891.  
  892.  
  893.  
  894. &lt;p&gt;The strategies and insights shared by RLP affiliates offer support and resources for peers navigating similar challenges. As we continue to unpack their insights, the next post will delve deeper into practical approaches for fostering adaptive leadership and organizational resilience.  &lt;/p&gt;
  895.  
  896.  
  897.  
  898. &lt;h2 class=&quot;wp-block-heading&quot;&gt;Roundtable participants&lt;/h2&gt;
  899.  
  900.  
  901.  
  902. &lt;p&gt;For the &lt;strong&gt;special collections roundtable&lt;/strong&gt; in May 2025, 25 participants from 25 institutions attended:&lt;/p&gt;
  903.  
  904.  
  905.  
  906. &lt;figure class=&quot;wp-block-table&quot;&gt;&lt;table class=&quot;has-fixed-layout&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;Clemson University&lt;/td&gt;&lt;td&gt;National Library of New Zealand&lt;/td&gt;&lt;td&gt;Harry Ransom Center (University of Texas at Austin)&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Cleveland Museum of Art&lt;/td&gt;&lt;td&gt;Smithsonian Libraries and Archives&lt;/td&gt;&lt;td&gt;University of Toronto&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Cornell University&lt;/td&gt;&lt;td&gt;The New School&lt;/td&gt;&lt;td&gt;University of Utah&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Emory University&lt;/td&gt;&lt;td&gt;University of Calgary &lt;/td&gt;&lt;td&gt;University of Washington&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;George Washington University&lt;/td&gt;&lt;td&gt;University of Arizona&lt;/td&gt;&lt;td&gt;University of Nevada, Las Vegas (UNLV)&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Haverford College&lt;/td&gt;&lt;td&gt;University of Delaware&lt;/td&gt;&lt;td&gt;Vanderbilt University&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Monash University&lt;/td&gt;&lt;td&gt;University of Kansas&lt;/td&gt;&lt;td&gt;Virginia Tech&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Montana State University&lt;/td&gt;&lt;td&gt;University of Miami&lt;/td&gt;&lt;td&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;National Library of Australia&lt;/td&gt;&lt;td&gt;University of Pittsburgh&lt;/td&gt;&lt;td&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/figure&gt;
  907.  
  908.  
  909.  
  910. &lt;p&gt;For the &lt;strong&gt;research support roundtable&lt;/strong&gt; in June 2025, 25 individuals participated from 23 RLP institutions:&lt;/p&gt;
  911.  
  912.  
  913.  
  914. &lt;figure class=&quot;wp-block-table&quot;&gt;&lt;table class=&quot;has-fixed-layout&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;Cold Spring Harbor Laboratory &lt;/td&gt;&lt;td&gt;Stony Brook University&lt;/td&gt;&lt;td&gt;University of Maryland&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Institute for Advanced Study &lt;/td&gt;&lt;td&gt;Syracuse University&lt;/td&gt;&lt;td&gt;University of Sydney&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Monash University &lt;/td&gt;&lt;td&gt;Tufts University&lt;/td&gt;&lt;td&gt;University of Tennessee, Knoxville&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;New York University&lt;/td&gt;&lt;td&gt;University of Arizona&lt;/td&gt;&lt;td&gt;University of Texas at Austin&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Ohio State University&lt;/td&gt;&lt;td&gt;University of California, Irvine&lt;/td&gt;&lt;td&gt;University of Toronto&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Penn State University&lt;/td&gt;&lt;td&gt;University of California, San Diego&lt;/td&gt;&lt;td&gt;University of Waterloo&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Rutgers University&lt;/td&gt;&lt;td&gt;University of Delaware&lt;/td&gt;&lt;td&gt;Utrecht University&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Smithsonian Libraries &amp;amp; Archives&lt;/td&gt;&lt;td&gt;University of Illinois Urbana-Champaign&lt;/td&gt;&lt;td&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/figure&gt;
  915.  
  916.  
  917.  
  918. &lt;p&gt;&lt;em&gt;Sincere thanks for Mercy Procaccini and Erica Melko for their suggestions that improved these posts. &lt;/em&gt;&lt;/p&gt;
  919.  
  920.  
  921.  
  922. &lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt;AI Nota Bene: The co-authors leveraged AI in the writing of these blog posts. AI was used to identify key themes from notes and discussion transcripts, which was useful for developing the blog outline and some suggested quotes. We found AI useful for suggesting subheadings and more concise language. Nevertheless, we did the majority of the writing ourselves. &lt;/em&gt;&lt;/p&gt;
  923.  
  924.  
  925.  
  926. &lt;p&gt;&lt;/p&gt;
  927. &lt;p&gt;The post &lt;a href=&quot;https://hangingtogether.org/leading-through-uncertainty-fostering-morale-and-connection-in-challenging-times/&quot;&gt;Leading through uncertainty: Fostering morale and connection in challenging times &lt;/a&gt; appeared first on &lt;a href=&quot;https://hangingtogether.org&quot;&gt;Hanging Together&lt;/a&gt;.&lt;/p&gt;</content>
  928.      <author>
  929.  <name>Rebecca Bryant</name>
  930. <uri>https://hangingtogether.org/</uri>
  931.      </author>
  932.    </entry>
  933.  
  934.    <entry>
  935.      <title>In the Library, With the Lead Pipe: Towards a Librarianship of the Future: Fostering Cultural Adaptation to Climate Change</title>
  936.      <link rel="alternate" type="text/html" href="https://www.inthelibrarywiththeleadpipe.org/2025/cultural-adaptation/"/>
  937.      <id>https://www.inthelibrarywiththeleadpipe.org/?p=11870</id>
  938.      <updated>2025-06-25T15:30:34+00:00</updated>
  939.      <content type="html">&lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;In Brief&lt;/strong&gt;&lt;/h3&gt;
  940.  
  941.  
  942.  
  943. &lt;p&gt;&lt;em&gt;The field of library science is paying increasing attention to anthropogenic climate change by exploring best practices for mitigating damage from environmental disasters and participating in climate action. This work is valuable, but it does not necessarily take on the cultural dimensions of climate adaptation. How are unquestioned ideas about time and decay supporting the carbon-heavy preservation of archival materials? How can libraries promote interspecies kinship, consider the legacy of industrial colonialism, and acknowledge the emotional impact of environmental destruction? To approach these questions, this article introduces thinkers from the environmental humanities and Anthropocene scholarship and applies their work to the field of library science. It explores alternatives to linear concepts of time, affective materiality of archival objects, palliative death ethics, and Indigenous perspectives of climate change as the legacy of industrial colonialism. The article concludes by suggesting ways that institutions can promote cultural adaptation to climate change.&lt;/em&gt;&lt;/p&gt;
  944.  
  945.  
  946.  
  947. &lt;hr class=&quot;wp-block-separator has-alpha-channel-opacity&quot; /&gt;
  948.  
  949.  
  950.  
  951. &lt;h4 class=&quot;wp-block-heading&quot;&gt;By &lt;a href=&quot;https://www.inthelibrarywiththeleadpipe.org/author/nora-zahn/&quot;&gt;Nora Zahn&lt;/a&gt;&lt;/h4&gt;
  952.  
  953.  
  954.  
  955. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/h3&gt;
  956.  
  957.  
  958.  
  959. &lt;p&gt;Novelist Ursula K. Le Guin begins her 1985 epic, &lt;em&gt;Always Coming Home, &lt;/em&gt;with a brief chapter titled “Towards an Archaeology of the Future.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b29f5d58-ce5a-4858-b332-8e422d3d67ac&quot; id=&quot;b29f5d58-ce5a-4858-b332-8e422d3d67ac-link&quot;&gt;1&lt;/a&gt;&lt;/sup&gt; The most likely intention is not to promote new standards to modernize the field of archaeology (“towards an archaeology &lt;em&gt;for &lt;/em&gt;the future”), nor is it to envision or influence what archaeological practice may look like decades or centuries from now (“towards a &lt;em&gt;future archaeology”&lt;/em&gt;). Rather, the book is a fictional ethnography of a people living in what we know as California’s Napa Valley thousands of years from the present day. What Le Guin likely means by the phrase is the most grammatically simple yet conceptually brain-melting interpretation: to replace the past with the future as the object of archaeological study. She is telling us to turn our heads and look in the opposite direction.&lt;/p&gt;
  960.  
  961.  
  962.  
  963. &lt;p&gt;The dissonance of studying the ruins of a society that doesn’t yet exist is apt for approaching the Anthropocene, the Earth’s current, unofficial geological epoch characterized by the ecosystem-altering impact of &lt;em&gt;Homo sapiens &lt;/em&gt;and resulting planetary disruptions.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#de2ca204-cb78-42c1-b1e8-9e1aa1fe0dcc&quot; id=&quot;de2ca204-cb78-42c1-b1e8-9e1aa1fe0dcc-link&quot;&gt;2&lt;/a&gt;&lt;/sup&gt; Since the term was coined in the early 2000s, scholars from various academic disciplines have considered its implications as a framework that elevates human behavior to the level of worldshifting ecological events.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#6d326017-fec3-4c28-8c79-a0648960dc80&quot; id=&quot;6d326017-fec3-4c28-8c79-a0648960dc80-link&quot;&gt;3&lt;/a&gt;&lt;/sup&gt; One such discipline is the environmental humanities, a field that emphasizes narrative and culture in approaching environmental challenges. Through an interdisciplinary lens combining the political, anthropological, literary, and/or philosophical, the field is united by the belief that humans and nature are intertwined. The environmental humanities welcomes traditionally excluded or undervalued perspectives into the discourse, with a particular openness to the voices of Indigenous scholars and activists. &lt;/p&gt;
  964.  
  965.  
  966.  
  967. &lt;p&gt;The field of library science is paying increasing attention to anthropogenic climate change. Librarians are exploring best practices for the logistical preparation of institutions for climate events like flooding and extreme heat&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#f6d09ba1-9719-4278-9812-3de999739923&quot; id=&quot;f6d09ba1-9719-4278-9812-3de999739923-link&quot;&gt;4&lt;/a&gt;&lt;/sup&gt;, advocating for reducing the environmental impact of industry-specific activities&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#a88b57b0-618a-4a1c-9a87-9bf6b1bf8d8f&quot; id=&quot;a88b57b0-618a-4a1c-9a87-9bf6b1bf8d8f-link&quot;&gt;5&lt;/a&gt;&lt;/sup&gt;, and discussing how libraries can promote and participate in climate action.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#cd555baa-bf29-40bd-934c-74cb8df74272&quot; id=&quot;cd555baa-bf29-40bd-934c-74cb8df74272-link&quot;&gt;6&lt;/a&gt;&lt;/sup&gt; This work is valuable as institutions physically adapt to an age of instability. However, it does not necessarily take on the cultural dimensions of climate adaptation that library science, with its predilection for the past and general focus on our own species, may need some nudging to address.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9d6bfed7-4fb5-4ae2-ab79-ca97e50ba8c3&quot; id=&quot;9d6bfed7-4fb5-4ae2-ab79-ca97e50ba8c3-link&quot;&gt;7&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  968.  
  969.  
  970.  
  971. &lt;p&gt;How are unquestioned ideas about time and decay supporting the carbon-heavy preservation of archival materials? How can libraries consider the legacy of industrial colonialism, acknowledge the emotional impact of environmental destruction, and promote interspecies kinship? We need a librarianship of the future, in which librarians and archivists can build on theory from the environmental humanities to involve the public in thinking intergenerationally and imagining the possibilities for a world beyond the Anthropocene.&lt;/p&gt;
  972.  
  973.  
  974.  
  975. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Time&lt;/strong&gt;&lt;/h3&gt;
  976.  
  977.  
  978.  
  979. &lt;p&gt;Alternative concepts of time that have emerged within the environmental humanities challenge the prevailing ethos that archives exist to preserve the past for humans of the future. When we consider those humans in the context of the climate crisis, we are faced with blunt uncertainty.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#7b2200e1-59cc-466d-8d1e-1706b79980ce&quot; id=&quot;7b2200e1-59cc-466d-8d1e-1706b79980ce-link&quot;&gt;8&lt;/a&gt;&lt;/sup&gt;  As Erik Radio puts it, a “continued existence of life,” or “one in which humans continue to act in a central role” or are “even recognizable as ourselves,” is in question.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#4c5922e4-9073-457d-aadc-2e72e0187e18&quot; id=&quot;4c5922e4-9073-457d-aadc-2e72e0187e18-link&quot;&gt;9&lt;/a&gt;&lt;/sup&gt; If the continuation of our species—either entirely, or with all the trappings of our current relationship with the planet—is not secure, then librarians must question the meaning of preserving materials for future humans who aren’t present or may not be able to access them.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b819b073-feef-4a95-9141-85fb2e4d0bb3&quot; id=&quot;b819b073-feef-4a95-9141-85fb2e4d0bb3-link&quot;&gt;10&lt;/a&gt;&lt;/sup&gt; And of course, questioning the value of an activity should prompt us to consider the cost: in this case, the carbon-intensive resources used to preserve these materials for an uncertain human future.&lt;/p&gt;
  980.  
  981.  
  982.  
  983. &lt;p&gt;&lt;em&gt;Deep time, &lt;/em&gt;which places humanity in the context of the total 4.54 billion years of Earth’s existence, proliferates Anthropocene scholarship. Geographer Kathryn Yusoff suggests that the Anthropocene framing offers a “new temporality”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fe40823f-feef-4366-bc73-a1a7ecff867f&quot; id=&quot;fe40823f-feef-4366-bc73-a1a7ecff867f-link&quot;&gt;11&lt;/a&gt;&lt;/sup&gt; by “embedd[ing]” our species in geologic time.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#cd6ad65e-3e15-40b8-b51e-0e22f9778d05&quot; id=&quot;cd6ad65e-3e15-40b8-b51e-0e22f9778d05-link&quot;&gt;12&lt;/a&gt;&lt;/sup&gt; This gives humans geological agency by placing us among the few entities that “possess the power of extinction”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#a5568431-b7a4-44e6-a9be-d391c0d28331&quot; id=&quot;a5568431-b7a4-44e6-a9be-d391c0d28331-link&quot;&gt;13&lt;/a&gt;&lt;/sup&gt; while “[shifting] the human timescale from biological life-course to that of epoch and species-life.” &lt;/p&gt;
  984.  
  985.  
  986.  
  987. &lt;p&gt;Establishing ourselves in this broader geologic context opens up imaginative doors. While discussing the causal role of fossil fuel extraction and use in facilitating species extinction, Yusoff notes that this way of positioning humans in time reveals our own potentially declining species as the “material expenditure of the remains of late capitalism.”  When we “[unearth] one fossil layer” to use as fuel for our machines, we “create another…that has our name on it.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#6a030430-c17f-45f6-aa2b-35e0fc29043e&quot; id=&quot;6a030430-c17f-45f6-aa2b-35e0fc29043e-link&quot;&gt;14&lt;/a&gt;&lt;/sup&gt; By materializing humanity as a future fossil layer, Yusoff expands our idea of what the history and record of our species could entail moving forward.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#29d9016f-6338-4f89-985a-f09071284704&quot; id=&quot;29d9016f-6338-4f89-985a-f09071284704-link&quot;&gt;15&lt;/a&gt;&lt;/sup&gt; Radio gets at a similar notion in his effort to conceptualize archival documents in a posthuman world, suggesting that Earth is the “document that can serve as a contextual ground for all others,” and that therefore in the present libraries and archives should “strongly consider broadening their scope to incorporate the perspectives of geologists.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1f5e66e8-c40b-4f38-963d-f2000d0f7968&quot; id=&quot;1f5e66e8-c40b-4f38-963d-f2000d0f7968-link&quot;&gt;16&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  988.  
  989.  
  990.  
  991. &lt;p&gt;If Yusoff and Radio use geology to detach from anthropocentric concepts of time, then an emphasis on interspecies relationships within the environmental humanities offers a different approach. Potawatomi scholar Kyle Powys Whyte uses various Indigenous perspectives to inform what he calls “kinship time,” in which “time is told through kinship relationships that entangle climate change with responsibility”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#bb228b6f-e5e3-4cbf-906c-971487d24f00&quot; id=&quot;bb228b6f-e5e3-4cbf-906c-971487d24f00-link&quot;&gt;17&lt;/a&gt;&lt;/sup&gt; as an alternative to portraying climate change using the “ticking clock” narrative encouraged by the traditional Western, linear timescale.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ece36f26-df2b-4d59-8b0b-a8a537133b19&quot; id=&quot;ece36f26-df2b-4d59-8b0b-a8a537133b19-link&quot;&gt;18&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  992.  
  993.  
  994.  
  995. &lt;p&gt;Whyte references Samantha Chisholm Hatfield et al.’s research on the connection between time and ecosystems among tribes like the Siletz in the Pacific Northwest, who have historically relied on ecological signals to track the passage of time. For example, an elder interviewed by Hatfield explains how the Siletz used to rely on the emergence of carpenter ants in the spring as a seasonal cue to start hunting pacific lampreys, a coastal fish species resembling eels, despite the fact that the two species are not ecologically related.  These days, though, the elder says that the “weather’s changed so much that you can’t mark anything like that” anymore. This example represents a broader trend from Hatfield’s research: multispecies relationships like the ones between humans, ants, and lampreys demonstrate a foundational way of telling time and connecting to the universe for the Siletz. When climate change threatens the ecosystem, it jeopardizes that “sense of order”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#df737817-c5bd-4f23-b492-faa73b772443&quot; id=&quot;df737817-c5bd-4f23-b492-faa73b772443-link&quot;&gt;19&lt;/a&gt;&lt;/sup&gt; and results in feelings of “abruptness and escalation.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#93cb57cc-8562-4d50-8f9f-ccb13e6dd5fa&quot; id=&quot;93cb57cc-8562-4d50-8f9f-ccb13e6dd5fa-link&quot;&gt;20&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  996.  
  997.  
  998.  
  999. &lt;p&gt;This way of telling time based on environmental cues is part of what Whyte (2018) refers to elsewhere in his research as a “seasonal round system” employed by peoples like the Anishinaabe/Neshnabé, in which public life is  “organized to change and shift throughout the year to adjust to the dynamics of ecosystems.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3d47ba75-bd46-4865-b771-73c64bd685b2&quot; id=&quot;3d47ba75-bd46-4865-b771-73c64bd685b2-link&quot;&gt;21&lt;/a&gt;&lt;/sup&gt; He also describes “spiraling time,” which brings together ancestors and their descendants&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1fcd47de-356e-4758-9ed0-50f529dfbfc9&quot; id=&quot;1fcd47de-356e-4758-9ed0-50f529dfbfc9-link&quot;&gt;22&lt;/a&gt;&lt;/sup&gt;, as well as Hatfield et al.’s explanation of time “based on a 3D construction” of ecological relationships.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#c3654cb2-e56a-4762-8b5f-e260820201ed&quot; id=&quot;c3654cb2-e56a-4762-8b5f-e260820201ed-link&quot;&gt;23&lt;/a&gt;&lt;/sup&gt; By using these visual metaphors, Whyte seeks to reject linear time in favor of worldviews prizing interconnectedness and responsibility.&lt;/p&gt;
  1000.  
  1001.  
  1002.  
  1003. &lt;p&gt;Le Guin sought to do the same in her fictional ethnography&lt;em&gt; Always Coming Home.&lt;/em&gt; The subjects of the book, far-future residents of California’s Napa Valley who are known as the Kesh, see all living things as people, and therefore “no distinction is made between human and natural history.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#a4cb055d-fd4d-4317-ba83-38294effefef&quot; id=&quot;a4cb055d-fd4d-4317-ba83-38294effefef-link&quot;&gt;24&lt;/a&gt;&lt;/sup&gt; They conceive of time as less structured than Le Guin’s (likely Western) reader does; the ethnographer informs us that “time and space are so muddled together that one is never sure whether [the Kesh] are talking about an era or an area.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#01378c31-bdac-41d6-bebd-51913591a222&quot; id=&quot;01378c31-bdac-41d6-bebd-51913591a222-link&quot;&gt;25&lt;/a&gt;&lt;/sup&gt; When the ethnographer tries to ask a Kesh librarian-historian of sorts for information about previous inhabitants of the region, it initiates a roundabout discussion that ultimately leads the ethnographer to conclude,&lt;/p&gt;
  1004.  
  1005.  
  1006.  
  1007. &lt;blockquote class=&quot;wp-block-quote is-layout-flow wp-block-quote-is-layout-flow&quot;&gt;
  1008. &lt;p&gt;&lt;em&gt;It’s hopeless. He doesn’t perceive time as a direction, let alone a progress, but as a landscape in which one may go any number of directions, or nowhere. He spatialises time; it is not an arrow, nor a river, but a house, the house he lives in. One may go from room to room, and come back; to go outside, all you have to do is open the door.&lt;/em&gt;&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fd563464-b29f-4cdb-9eef-1dad3431b3dc&quot; id=&quot;fd563464-b29f-4cdb-9eef-1dad3431b3dc-link&quot;&gt;26&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1009. &lt;/blockquote&gt;
  1010.  
  1011.  
  1012.  
  1013. &lt;p&gt;This spatial conceptualization of time is directly tied to the Kesh’s assumptions of interspecies kinship. Their world is one in which human people live in balance, respect, and partnership with all others. Therefore they refer to us—that is, industrial Western civilization in the present day, which, per our linear perspective of time, existed long ago in their past and was ultimately destroyed by an intervening ecological apocalypse—as “the time outside,” and “when they lived outside the world.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fd18b95b-ab3c-47fb-9ae9-ba0be694005a&quot; id=&quot;fd18b95b-ab3c-47fb-9ae9-ba0be694005a-link&quot;&gt;27&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1014.  
  1015.  
  1016.  
  1017. &lt;p&gt;If people of all species who exist in other millennia are just across the threshold, how does it change the way we connect to the earth and other beings? And what does it mean for a field like library science that is paradoxically oriented toward both past and future, where materials from the past are collected and preserved for the benefit of future generations? Is it possible to extract library science from linear time, like one might extract a glittering stone from a running stream?&lt;/p&gt;
  1018.  
  1019.  
  1020.  
  1021. &lt;p&gt;Removing distance from our sense of time has some interesting implications for collections. Within the past-orientation, if people from many years ago are a short distance away, then age is less of a factor in deciding which of their objects or papers are worth preserving. If a British colonist named Gilberrt Bant living in 1690s Boston, for example, is just across the threshold then how important is it to keep a &lt;a href=&quot;https://hollisarchives.lib.harvard.edu/repositories/11/archival_objects/2692089&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;receipt&lt;/a&gt; from when his father left him money? &lt;/p&gt;
  1022.  
  1023.  
  1024.  
  1025. &lt;p&gt;Within the future-orientation, on the other hand, if people from later epochs are on the other side of the door, then it may incentivize concern for future beings in how we choose to preserve materials and consume resources. It’s the proximity principle: if we can see our descendants right there in our front yard, we are more likely to consider them as we choose what and how we collect and preserve.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3f47cc57-172b-4480-84ee-bcba0a176c75&quot; id=&quot;3f47cc57-172b-4480-84ee-bcba0a176c75-link&quot;&gt;28&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1026.  
  1027.  
  1028.  
  1029. &lt;p&gt;Clearly, this extreme shift in perspective requires intergenerational thinking. Kyle Whyte’s spiraling time encourages us to “consider ourselves as living alongside future and past relatives simultaneously” throughout our lives.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b1f8a53b-755a-4984-a80b-3ada2cfeef80&quot; id=&quot;b1f8a53b-755a-4984-a80b-3ada2cfeef80-link&quot;&gt;29&lt;/a&gt;&lt;/sup&gt; Philosopher Julia Gibson takes it a step further in their discussion of palliative death ethics for climate change, suggesting that past and future life does not just surround us, but in fact, is present &lt;em&gt;within&lt;/em&gt; us as individuals. This is because “the dead, the dying, the living, and those yet-to-be are not only distinct generations of beings along a linear sequence but coexistent facets of every being.” That is, outside of linear time, an individual occupies the past, present, and future all at once. As such, Gibson calls for rejecting the “‘us in the present’ vs. ‘them in the future’” dichotomy in climate change discourses.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#be15f66c-e793-4dd0-b17d-63ae83967179&quot; id=&quot;be15f66c-e793-4dd0-b17d-63ae83967179-link&quot;&gt;30&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1030.  
  1031.  
  1032.  
  1033. &lt;p&gt;Intergenerational thinking gives librarians the opportunity to consider our entire way of determining the value of objects we collect and preserve. Is constantly cooling and dehumidifying the room where we store Gilberrt Bant’s receipt worth releasing the carbon that will threaten our descendants’—or our own—access to sufficient food, or livable temperatures, or safety from mega-storms? Is it worth contributing to the ongoing mass extinction of other species?&lt;/p&gt;
  1034.  
  1035.  
  1036.  
  1037. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Mortality&lt;/strong&gt;&lt;/h3&gt;
  1038.  
  1039.  
  1040.  
  1041. &lt;p&gt;Let’s consider the objects themselves. Archivist Marika Cifor calls on the rise of new materialism in recent feminist theory to inform what she terms the “liveliness” of physical matter in an archival setting. Building on Karen Barad’s “entanglement of matter and meaning,”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#876d2761-d11e-4abb-8053-42016c969dc2&quot; id=&quot;876d2761-d11e-4abb-8053-42016c969dc2-link&quot;&gt;31&lt;/a&gt;&lt;/sup&gt; Cifor writes that matter is “animate and imbued with a particular kind of agential and affective vitality.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9962695d-7d30-453f-b724-0218edf3cf27&quot; id=&quot;9962695d-7d30-453f-b724-0218edf3cf27-link&quot;&gt;32&lt;/a&gt;&lt;/sup&gt; To her, “humans and objects are fundamentally and crucially interrelated” in a way that cannot be fully expressed with words,  giving objects a “materiality” that “resists language.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#8ffb17e1-d58c-4811-abac-15ba85ff8a10&quot; id=&quot;8ffb17e1-d58c-4811-abac-15ba85ff8a10-link&quot;&gt;33&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1042.  
  1043.  
  1044.  
  1045. &lt;p&gt;Cifor experiences this first-hand in the archives when she encounters the bloodstained clothing that gay rights activist Harvey Milk was wearing when he was assassinated in 1978. Touching the clothes and their stains gives her an “intimate experience of horror in the archives that blatantly refuses intellectualisation” and makes her understand Milk in an entirely new way.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#58abf47b-c09c-4d56-a7b6-9b543f71e633&quot; id=&quot;58abf47b-c09c-4d56-a7b6-9b543f71e633-link&quot;&gt;34&lt;/a&gt;&lt;/sup&gt; The emotionally affective capacity of archival objects isn’t limited to horror, though. Cifor also visits Yale University’s Beinecke Special Collections Library to encounter the personal “Stud File” of Samuel Steward, a prominent gay figure from the twentieth century. She describes the feeling of disgust, the “messy space between desire and repulsion,” she experienced when she found pubic hairs taped to a couple of records in Steward’s homemade card catalog of sexual partners.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#5515a0c1-980e-451a-8a1c-1c4f49ab40a0&quot; id=&quot;5515a0c1-980e-451a-8a1c-1c4f49ab40a0-link&quot;&gt;35&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1046.  
  1047.  
  1048.  
  1049. &lt;p&gt;Of course, as most librarians can attest, archival objects can also invoke less complicated feelings like awe. This is part of our motivation for preserving items like Bant’s receipt, or a 1796 &lt;a href=&quot;https://hollisarchives.lib.harvard.edu/repositories/11/archival_objects/2692067&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;Irish state lottery ticket&lt;/a&gt;, or a &lt;a href=&quot;https://cdli.mpiwg-berlin.mpg.de/artifacts/235123&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;Neo-Babylonian cuneiform tablet&lt;/a&gt; from circa 626-539 BCE documenting the receipt of money and barley. Beyond their functional context or intended purpose, we keep them available for the pleasure of being in proximity to objects that are old. But preserving items because of their age ultimately undermines their liveliness and agency. This is because, per Cifor, archives are “in a state of constant flux, shifting with each new intra-action of the various and changing actors that constitute it.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#0b2fd0e4-497e-469a-8c3c-52ba96189cf6&quot; id=&quot;0b2fd0e4-497e-469a-8c3c-52ba96189cf6-link&quot;&gt;36&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1050.  
  1051.  
  1052.  
  1053. &lt;p&gt;Dani Stuchel discusses this in the context of physical, non-human actors like sunlight and moisture that interact with archival materials on a chemical level to prompt their gradual deterioration. Decay is an “expression” of archival entropy—the state of constant, ongoing movement and change that applies to all objects—because it “reveals aspects of materiality not visible in earlier states” as it goes on.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d335d4c7-3c32-49b0-b5ee-cd81bf335a37&quot; id=&quot;d335d4c7-3c32-49b0-b5ee-cd81bf335a37-link&quot;&gt;37&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1054.  
  1055.  
  1056.  
  1057. &lt;p&gt;Traditional preservation attempts to pause this process, leading to what, to me, is one of the many intriguing contradictions arising from this line of research: the attempt to delay the inevitable by keeping an object “alive” imposes stasis, while the process toward death indicates liveliness and vibrancy. Stuchel, though, makes a different and more nuanced point. Rather than decay implying “action” and preservation implying inaction, Stuchel argues that action takes place regardless:&lt;/p&gt;
  1058.  
  1059.  
  1060.  
  1061. &lt;blockquote class=&quot;wp-block-quote is-layout-flow wp-block-quote-is-layout-flow&quot;&gt;
  1062. &lt;p&gt;&lt;em&gt;[N]o archival [material] stays ‘pristine’ on its own: the appearance of stasis or sameness is continually constructed through archival action. Alongside these actions, we continually re-accept the renewed thing as ‘original’ or ‘authentic.’ The postmodern refrain of performativity echoes on here: an archival thing is constantly becoming an archival thing.&lt;/em&gt;&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#cd783224-9811-4340-82cf-89a338dd81b7&quot; id=&quot;cd783224-9811-4340-82cf-89a338dd81b7-link&quot;&gt;38&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1063. &lt;/blockquote&gt;
  1064.  
  1065.  
  1066.  
  1067. &lt;p&gt;If archival materials can make us feel things and are constantly changing, then perhaps letting them change can help us emotionally grapple with broader instability. Stuchel calls for “introspection” on “the psychological impulses which drive our need to keep, evince, retrieve, and preserve,”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#4afabc36-251a-4e9e-af73-e3d0409cea93&quot; id=&quot;4afabc36-251a-4e9e-af73-e3d0409cea93-link&quot;&gt;39&lt;/a&gt;&lt;/sup&gt; suggesting that a fear of letting objects decay—and therefore “forgetting” them from the cultural memory—indicates a broader fear of our own death.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#75163e1e-cddd-44d3-8ff9-0ff420994ee6&quot; id=&quot;75163e1e-cddd-44d3-8ff9-0ff420994ee6-link&quot;&gt;40&lt;/a&gt;&lt;/sup&gt; Le Guin suggests the same via her narrator in &lt;em&gt;Always Coming Home&lt;/em&gt;:&lt;/p&gt;
  1068.  
  1069.  
  1070.  
  1071. &lt;blockquote class=&quot;wp-block-quote is-layout-flow wp-block-quote-is-layout-flow&quot;&gt;
  1072. &lt;p&gt;&lt;em&gt;Perhaps not many of us could say why we save so many words, why our forests must all be cut to make paper to mark our words on, our rivers dammed to make electricity to power our word processors; we do it obsessively, as if afraid of something, as if compensating for something. Maybe we’re afraid of death, afraid to let our words simply be spoken and die, leaving silence for new words to be born in.&lt;/em&gt;&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9de0c803-0e87-43da-898f-36ce4776f32d&quot; id=&quot;9de0c803-0e87-43da-898f-36ce4776f32d-link&quot;&gt;41&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1073. &lt;/blockquote&gt;
  1074.  
  1075.  
  1076.  
  1077. &lt;p&gt;What happens emotionally if we let our resistance to archival entropy soften? Stuchel points to “very human feelings”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#63c49b3d-2880-4959-883a-8440f26050e8&quot; id=&quot;63c49b3d-2880-4959-883a-8440f26050e8-link&quot;&gt;42&lt;/a&gt;&lt;/sup&gt; that we miss when we reject the possibility of decay, since we can “mediate our grief about and connection to the past” through these “transitional or memorial objects.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ebc2cf24-f36e-4325-ab70-0ad7b0d9c92b&quot; id=&quot;ebc2cf24-f36e-4325-ab70-0ad7b0d9c92b-link&quot;&gt;43&lt;/a&gt;&lt;/sup&gt; When the archival object represents a way of life that climate change may take away, witnessing its death gives us an opportunity to engage with a broader mourning process.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3f857653-4b35-40cc-8a14-53c97ec8c6d3&quot; id=&quot;3f857653-4b35-40cc-8a14-53c97ec8c6d3-link&quot;&gt;44&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1078.  
  1079.  
  1080.  
  1081. &lt;p&gt;And in a culture whose mainstream discourse refuses to explore collective acceptance of change or publicly grieve the ongoing losses wrought by environmental destruction, engagement with these feelings is vital. In Gibson’s palliative death ethics for climate change and extinction, simply not looking away is critical. We must “[refuse] to neglect [our] ecological partners even as they are…leaving us.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fcf30ac4-d70a-4200-bc12-3f01e2bcd030&quot; id=&quot;fcf30ac4-d70a-4200-bc12-3f01e2bcd030-link&quot;&gt;45&lt;/a&gt;&lt;/sup&gt; Giving up the illusion of control is key here. As Cifor notes on materiality in archives, “affect disrupts the notion of an intentional and agential human subject in full control of the matter with which they engage.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3bc7a048-659a-4b56-8a02-b41a8251b4cd&quot; id=&quot;3bc7a048-659a-4b56-8a02-b41a8251b4cd-link&quot;&gt;46&lt;/a&gt;&lt;/sup&gt; This is true of both archival objects and our planet in the Anthropocene.&lt;/p&gt;
  1082.  
  1083.  
  1084.  
  1085. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Finality&lt;/strong&gt;&lt;/h3&gt;
  1086.  
  1087.  
  1088.  
  1089. &lt;p&gt;It is difficult, to say the least, to understand how the effects of anthropogenic climate change will play out and to glimpse possible futures as a result. This is due in part to what Rick Crownshaw calls the Anthropocene’s “problems of scale,” which he writes are “becoming something of a mantra” in memory studies in particular and the humanities and social sciences more broadly.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#8ba35c4b-97bf-4163-a65d-96b3ad2bdb9c&quot; id=&quot;8ba35c4b-97bf-4163-a65d-96b3ad2bdb9c-link&quot;&gt;47&lt;/a&gt;&lt;/sup&gt; Because it is “unfolding unevenly across time and space, matter and life (human and nonhuman), and through planetary systems and processes (engendering systemic feedback loops and crossing the threshold of systemic tipping points),” the Anthropocene’s effects are only discernible “through a ‘derangement’…of the scales of cognition, remembrance, and representation.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#0d7fb826-2191-480e-b543-65cd831cd748&quot; id=&quot;0d7fb826-2191-480e-b543-65cd831cd748-link&quot;&gt;48&lt;/a&gt;&lt;/sup&gt; He considers the “‘humanist enclosures’ of cultural memory studies” to be “ill equipped” to manage these various scales.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d0f0fa43-4079-4746-bf2d-6e0554440c69&quot; id=&quot;d0f0fa43-4079-4746-bf2d-6e0554440c69-link&quot;&gt;49&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1090.  
  1091.  
  1092.  
  1093. &lt;p&gt;This dissonance is akin to the challenge of thinking intergenerationally when a person has only ever perceived oneself as existing within linear time. It is at play in efforts to conceive of archives in a future that may lack human cultural institutions as we know them today.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d431a863-2629-4aad-b6c2-2c3e8c81b55f&quot; id=&quot;d431a863-2629-4aad-b6c2-2c3e8c81b55f-link&quot;&gt;50&lt;/a&gt;&lt;/sup&gt; It is the grand contradiction, daunting obstacle, and essential task of &lt;em&gt;anticipatory memory&lt;/em&gt;, in which we attempt to envision the conditions of future beings in order to join them in retroactively observing our own time and remembering the critical period that took place between our present and theirs.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#311fd9a5-33fb-4cd0-80da-2b0b4766acd8&quot; id=&quot;311fd9a5-33fb-4cd0-80da-2b0b4766acd8-link&quot;&gt;51&lt;/a&gt;&lt;/sup&gt; That period is wrapped up, perhaps, with survival.&lt;/p&gt;
  1094.  
  1095.  
  1096.  
  1097. &lt;p&gt;Of course, survival can be difficult to imagine when reports about &lt;a href=&quot;https://library.wmo.int/records/item/68835-state-of-the-global-climate-2023&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;record temperatures&lt;/a&gt; and &lt;a href=&quot;https://www.aljazeera.com/news/2023/3/20/point-of-no-return-un-report-to-provide-stark-climate-warning&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;points of no return&lt;/a&gt; stir up an anxious urgency that collides with broadly avoidant apathy&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ccecd455-e21e-4023-83dc-3191012c4580&quot; id=&quot;ccecd455-e21e-4023-83dc-3191012c4580-link&quot;&gt;52&lt;/a&gt;&lt;/sup&gt; and &lt;a href=&quot;https://www.eenews.net/articles/trump-fired-the-heat-experts-now-he-might-kill-their-heat-rule/&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;overwhelming inadequacy at the highest political levels&lt;/a&gt;. The outlook can seem rather bleak. However, as Gibson cautions us, there is a difference between “attending centrally to grief and death” and “becoming fatalistic,”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fa54bfb2-5a9a-4baa-bef8-8d5569ab6cb2&quot; id=&quot;fa54bfb2-5a9a-4baa-bef8-8d5569ab6cb2-link&quot;&gt;53&lt;/a&gt;&lt;/sup&gt; and it is important to consider the implications of succumbing to the latter.&lt;/p&gt;
  1098.  
  1099.  
  1100.  
  1101. &lt;p&gt;Whyte offers a useful perspective of survival as he considers dominant Western state-of-emergency or apocalypse narratives of climate change. From the perspective of Indigenous peoples, an apocalypse has already taken place. They have already experienced what the rest of us dread most about climate change—“ecosystem collapse, species loss, economic crash, drastic relocation, and cultural disintegration”—and are currently living in their ancestors’ version of a dystopia.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#45c4ea2f-5485-4f4b-b562-03fe5b6dff99&quot; id=&quot;45c4ea2f-5485-4f4b-b562-03fe5b6dff99-link&quot;&gt;54&lt;/a&gt;&lt;/sup&gt; Therefore, he argues, the state-of-emergency narrative is flawed because it veils the dominant culture’s continued benefit from colonization&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d7b1b8a2-e6af-4efa-aa01-afbd1844a4d9&quot; id=&quot;d7b1b8a2-e6af-4efa-aa01-afbd1844a4d9-link&quot;&gt;55&lt;/a&gt;&lt;/sup&gt;, promotes a hasty reliance on solutions from the systems of irresponsibility that got us here in the first place and favor protecting the status quo&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ca936b9c-8552-4f8b-9c5a-5ac4868a58b9&quot; id=&quot;ca936b9c-8552-4f8b-9c5a-5ac4868a58b9-link&quot;&gt;56&lt;/a&gt;&lt;/sup&gt;, and prevents humans from slowing down and prioritizing intra- and interspecies kinship in the process.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#914b238e-6a72-46a2-a39f-7d4bf8b9a8de&quot; id=&quot;914b238e-6a72-46a2-a39f-7d4bf8b9a8de-link&quot;&gt;57&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1102.  
  1103.  
  1104.  
  1105. &lt;p&gt;Whyte highlights scholars Heather Davis and Zoe Todd’s view of industrial colonialism as a “seismic shockwave” that “compact[s] and speed[s] up time, laying waste to legal orders, languages and place-story in quick succession.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#515ff931-2ed2-4895-a5a1-ad3cb6797292&quot; id=&quot;515ff931-2ed2-4895-a5a1-ad3cb6797292-link&quot;&gt;58&lt;/a&gt;&lt;/sup&gt; It caused the “violent, fleshy loss of 50 million Indigenous peoples in the Americas,” and now via climate change its “reverberations” are starting to impact the imperial nations that first started it.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#37a77ac7-6f7c-47ee-80c0-6581e01da0bd&quot; id=&quot;37a77ac7-6f7c-47ee-80c0-6581e01da0bd-link&quot;&gt;59&lt;/a&gt;&lt;/sup&gt; He urges readers to resist general narratives of “finality” and “lastness” put forth by non-Native climate advocates that conceptualize change as “describing movement or transition from stability to crisis—where crisis signals an impending end.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9186e44d-7e9e-49ef-94c4-aae1dfe381f0&quot; id=&quot;9186e44d-7e9e-49ef-94c4-aae1dfe381f0-link&quot;&gt;60&lt;/a&gt;&lt;/sup&gt; That’s because such narratives discount the already existing apocalyptic circumstances of Indigenous communities under settler colonialism explained above.&lt;/p&gt;
  1106.  
  1107.  
  1108.  
  1109. &lt;p&gt;Whyte also argues that those portrayals belie the fact that, as uncomfortable as it is for these non-Native allies to admit, many are living in their own ancestors’ version of a fantasy. They are benefitting from both the generations-long destruction of Indigenous ways of life and the legal and moral justification of that destruction, while framing themselves as “protagonists” who can “save Indigenous peoples” from a newly and urgently precarious position—which, Whyte notes, “their ancestors of course failed to do.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#944922b1-6afe-42ad-b52f-f8905b783d5e&quot; id=&quot;944922b1-6afe-42ad-b52f-f8905b783d5e-link&quot;&gt;61&lt;/a&gt;&lt;/sup&gt; By neglecting to acknowledge the legacy of industrial colonialism when approaching anthropogenic climate change, Western systems of power risk perpetuating it further.&lt;/p&gt;
  1110.  
  1111.  
  1112.  
  1113. &lt;p&gt;A related critique that arises in environmental humanities scholarship takes issue with the Anthropocene’s implication that all human beings constitute a single, forceful entity. It may seem like a “neutral” concept, according to Kathryn Yusoff, but it is based on a politics that “universalises the inheritance and responsibility for fossil fuel consumption”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#8d4bdc01-aba0-48fb-a156-98df9e71eaf1&quot; id=&quot;8d4bdc01-aba0-48fb-a156-98df9e71eaf1-link&quot;&gt;62&lt;/a&gt;&lt;/sup&gt; despite that consumption’s uneven distribution across the globe. Davis and Todd read the Anthropocene as an “extension and enactment of colonial logic [that] systemically erases difference,”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d6140629-cea4-4d6f-a667-d19efa75945c&quot; id=&quot;d6140629-cea4-4d6f-a667-d19efa75945c-link&quot;&gt;63&lt;/a&gt;&lt;/sup&gt; and they would rather acknowledge those varying levels of responsibility.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#750e96a4-42e1-40cd-b753-434701637bd8&quot; id=&quot;750e96a4-42e1-40cd-b753-434701637bd8-link&quot;&gt;64&lt;/a&gt;&lt;/sup&gt; This narrative flattening of numerous human communities existing across a broad spectrum of emissions, consumption, and culpability for climatic effects is a concern that white Westerners like me should consider when writing about the Anthropocene and its impact.&lt;/p&gt;
  1114.  
  1115.  
  1116.  
  1117. &lt;p&gt;It may seem contradictory to reject finality as part of the Anthropocene narrative while advocating for palliative death ethics in our approach to climate change, so it is crucial to clarify the difference. Le Guin has an oft-quoted line: “We live in capitalism, its power seems inescapable—but then, so did the divine right of kings.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#f39a0994-7083-4ab1-aab7-946d93ccf0c7&quot; id=&quot;f39a0994-7083-4ab1-aab7-946d93ccf0c7-link&quot;&gt;65&lt;/a&gt;&lt;/sup&gt; &lt;em&gt;Late capitalism, &lt;/em&gt;a phrase that has become ubiquitous on the Left in recent years, echoes Le Guin’s hopefulness. Both seek to transcend the imaginative shortcomings of linear time imposed on those experiencing it by giving language to the eventual end of an entrenched system whose mythology relies on perpetuity and endless expansion.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#13ed1670-b582-4979-a61f-590915c74a68&quot; id=&quot;13ed1670-b582-4979-a61f-590915c74a68-link&quot;&gt;66&lt;/a&gt;&lt;/sup&gt; &lt;/p&gt;
  1118.  
  1119.  
  1120.  
  1121. &lt;p&gt;For those trapped in a linear time perspective, on the other hand, there is nothing “late” about our experience of capitalism because there is no end in sight. Considering how climate change works, it might alternatively be reasonable to wonder if it is &lt;em&gt;too late, &lt;/em&gt;since we’re only now feeling the effects from carbon emissions decades ago, and the effects of today’s emissions won’t be felt for decades more. Too late for what, though? Perhaps it is too late for &lt;em&gt;Homo sapiens&lt;/em&gt;; or perhaps, as Yusoff posits, capitalism may be what goes extinct instead of our species.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#85643f8c-0ba0-42b5-ae16-0a2e7d6d7b12&quot; id=&quot;85643f8c-0ba0-42b5-ae16-0a2e7d6d7b12-link&quot;&gt;67&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1122.  
  1123.  
  1124.  
  1125. &lt;p&gt;This is the difference between mortality and finality. We can expect not to outlive capitalism while understanding that others will. We can acknowledge death intimately and locally while also thinking intergenerationally and acting with care for humans and other beings we may never meet. We can employ the “geographic imaginations”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1c96bfd6-2d51-4fa2-82fd-669986b48f69&quot; id=&quot;1c96bfd6-2d51-4fa2-82fd-669986b48f69-link&quot;&gt;68&lt;/a&gt;&lt;/sup&gt; enabled by the Anthropocene to try to take on its problems of scale. The end of one person’s, or one society’s, or one species’s way of life is not an end to all.&lt;/p&gt;
  1126.  
  1127.  
  1128.  
  1129. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Remembrance and Responsibility&lt;/strong&gt;&lt;/h3&gt;
  1130.  
  1131.  
  1132.  
  1133. &lt;p&gt;Gibson emphasizes the act of remembrance as an “ongoing communal ethic” that “matters deeply” under conditions of environmental injustice.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#c7bad781-8abd-4087-8a7a-2f6072605442&quot; id=&quot;c7bad781-8abd-4087-8a7a-2f6072605442-link&quot;&gt;69&lt;/a&gt;&lt;/sup&gt; As some of the cultural spaces best positioned to attend to public remembrance, institutions like libraries and archives must try. Luckily, library science’s broad range of involvement with the public provides diverse opportunities to engage with this work.&lt;/p&gt;
  1134.  
  1135.  
  1136.  
  1137. &lt;p&gt;For archives and special collections, Stuchel suggests that the ephemerality of cultural heritage objects can activate public care and remembrance. Stuchel advocates for geographer Caitlin DeSilvey’s concept of curated decay, in which an item’s “disintegration” is “incorporated into heritage practice.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#c6897d60-8258-4b81-8d36-4106c1ea92f2&quot; id=&quot;c6897d60-8258-4b81-8d36-4106c1ea92f2-link&quot;&gt;70&lt;/a&gt;&lt;/sup&gt; This approach lets us “work with the entire ‘lifecycle’ of a thing”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#7e74d1a8-34f9-4d17-9ae8-64b32a0a1fad&quot; id=&quot;7e74d1a8-34f9-4d17-9ae8-64b32a0a1fad-link&quot;&gt;71&lt;/a&gt;&lt;/sup&gt; and consider its ecological context&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#280c5c3b-3abf-4467-9742-30743dbdb56a&quot; id=&quot;280c5c3b-3abf-4467-9742-30743dbdb56a-link&quot;&gt;72&lt;/a&gt;&lt;/sup&gt; as it disintegrates. Incorporating decay into the public’s experience of materials could achieve the emotional engagement with mortality described above while also fulfilling what, according to Samantha Winn, is “both an ethical imperative and a functional exigency” of archivists to “develop practices which do not require infinite exploitable resources.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b122a6d4-0213-45d5-ba8e-1596a5eb51b8&quot; id=&quot;b122a6d4-0213-45d5-ba8e-1596a5eb51b8-link&quot;&gt;73&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1138.  
  1139.  
  1140.  
  1141. &lt;p&gt;There are also plenty of opportunities for the public to engage with the issues discussed here via library programs that foster dialogue about death ethics, cultural adaptation to climate change, and alternative concepts of time. Oral history projects and time capsules are a great way to get patrons thinking more deeply about their position in time and space. So are traveling displays like &lt;a href=&quot;https://www.sinkingandmelting.org/&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;&lt;em&gt;A People’s Archive of Sinking and Melting&lt;/em&gt;&lt;/a&gt;, an art exhibit and global community archive that solicits objects linked to climate events from around the world. Public libraries can also take advantage of their genealogical resources to invite patrons to interrogate linear visualizations of time and participate in speculative dialogue with their ancestors and descendants.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#4f90ed06-38b1-4835-b68c-955945d73515&quot; id=&quot;4f90ed06-38b1-4835-b68c-955945d73515-link&quot;&gt;74&lt;/a&gt;&lt;/sup&gt; And many public libraries already engage mental health professionals to facilitate programming, so the scope of those sorts of programs could expand to include climate-related stress and anxiety.&lt;/p&gt;
  1142.  
  1143.  
  1144.  
  1145. &lt;p&gt;There is also potential for public and academic libraries to facilitate public mourning and remembrance. They can create programming around Remembrance Day for Lost Species, an annual international observance started by U.K. artists in 2011 that takes place November 30th of each year.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#532c3111-3e05-4304-bb56-c3edd974e905&quot; id=&quot;532c3111-3e05-4304-bb56-c3edd974e905-link&quot;&gt;75&lt;/a&gt;&lt;/sup&gt; They can engage artists and activists to create physical memorials to extinct species onsite in and around library spaces. And they can host grieving rituals like the funeral for the Okökull glacier held by activists in Iceland in 2019&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#02d3e66b-6108-4509-8c81-122a1c37855b&quot; id=&quot;02d3e66b-6108-4509-8c81-122a1c37855b-link&quot;&gt;76&lt;/a&gt;&lt;/sup&gt;, embroidery artist Kate Tume’s &lt;a href=&quot;https://www.lostspeciesday.org/?p=1274&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;Hallowed Ground&lt;/a&gt; project embroidering critically endangered animals’ habitats while leaving empty space in the shapes of the beings themselves, and the nondenominational &lt;a href=&quot;https://bombyx.live/events/earth-grief/&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;Earth Grief&lt;/a&gt; event that took place at Bombyx Center for Arts and Equity in Florence, Massachusetts in July 2024.&lt;/p&gt;
  1146.  
  1147.  
  1148.  
  1149. &lt;p&gt;And then, of course, there’s fiction. There is a reason why so much environmental humanities scholarship overtly employs speculative fiction to theorize&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#e82fddd3-cd76-4244-928d-17825e1d4609&quot; id=&quot;e82fddd3-cd76-4244-928d-17825e1d4609-link&quot;&gt;77&lt;/a&gt;&lt;/sup&gt;; here Le Guin’s &lt;em&gt;Always Coming Home &lt;/em&gt;informed an entire way of conceptualizing time. Fiction frequently serves as our only frame of reference for imagining our shared climate future. Its authors, through their extensive worldbuilding, provide kernels of possible detail about the conditions of our intergenerational reality that can help readers begin to reckon with the changes they are facing in the context of coexisting geopolitical and socioeconomic factors.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ddc5f952-2a08-4286-a3c6-3d19baa2b7d7&quot; id=&quot;ddc5f952-2a08-4286-a3c6-3d19baa2b7d7-link&quot;&gt;78&lt;/a&gt;&lt;/sup&gt; Beyond devoting budgetary resources to purchasing speculative climate fiction for their collections, libraries can organize programming like book clubs and writing workshops to encourage their patrons to explore the genre further.&lt;/p&gt;
  1150.  
  1151.  
  1152.  
  1153. &lt;p&gt;Donna Haraway, who is a bit of a celebrity among environmental humanists, advises us to envision the Anthropocene as “more a boundary event than an epoch” and reduce it as much as we can in order to move on to a future in which the trajectory has changed.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9a843341-ff25-4a7e-99ed-7452ff1a6a19&quot; id=&quot;9a843341-ff25-4a7e-99ed-7452ff1a6a19-link&quot;&gt;79&lt;/a&gt;&lt;/sup&gt; Colebrook describes that next possible epoch as one in which “humanity might think beyond itself to a life of which it is only a contingent organic part.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1ab18060-2610-4420-bce5-71e176760eac&quot; id=&quot;1ab18060-2610-4420-bce5-71e176760eac-link&quot;&gt;80&lt;/a&gt;&lt;/sup&gt; But making this vital shift—the one that Rick Crownshaw believes cultural memory studies is “ill equipped” for—comes at a cost. &lt;/p&gt;
  1154.  
  1155.  
  1156.  
  1157. &lt;p&gt;Whyte calls that cost “[giving up] the underlying conditions of domination that dispossess Indigenous people.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#5ef0388e-94dc-4cc2-aa7b-b04f5936839a&quot; id=&quot;5ef0388e-94dc-4cc2-aa7b-b04f5936839a-link&quot;&gt;81&lt;/a&gt;&lt;/sup&gt; Yusoff calls it the “sacrificial responsibility” to leave fossil fuels in the ground.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#91251989-82d3-4836-8dac-bd842c41f8e6&quot; id=&quot;91251989-82d3-4836-8dac-bd842c41f8e6-link&quot;&gt;82&lt;/a&gt;&lt;/sup&gt; Roy Scranton calls it “adapting, with mortal humility, to our new reality.”&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#5cc239dc-1c88-4a07-9ee8-9789969247bb&quot; id=&quot;5cc239dc-1c88-4a07-9ee8-9789969247bb-link&quot;&gt;83&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1158.  
  1159.  
  1160.  
  1161. &lt;p&gt;This piece seems critical, and it is where I intend to take this research from here. Perhaps, by more explicitly exploring library science’s relationship with time, commodity fetishism, and industrial colonialism, we can begin to parse the meaning of “mortal humility” for our field.&lt;sup class=&quot;fn&quot;&gt;&lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#6613eaf9-1eed-42e9-baf3-a1b6ab67231f&quot; id=&quot;6613eaf9-1eed-42e9-baf3-a1b6ab67231f-link&quot;&gt;84&lt;/a&gt;&lt;/sup&gt; Here and now, from our location within linear time, we must enact a kinship-based, mortality-embracing, finality-resisting, intergenerational way forward. We need a librarianship of the future.&lt;/p&gt;
  1162.  
  1163.  
  1164.  
  1165. &lt;hr class=&quot;wp-block-separator has-alpha-channel-opacity&quot; /&gt;
  1166.  
  1167.  
  1168.  
  1169. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Acknowledgments&lt;/strong&gt;&lt;/h3&gt;
  1170.  
  1171.  
  1172.  
  1173. &lt;p&gt;&lt;em&gt;I’d like to thank reviewers Katherine Witzig and Ryan Randall, as well as publishing editor Ian Beilin, for all of their work in preparing this piece for publication. Another huge thanks goes to Donia Conn for guiding me through the research and writing process; her willingness to say yes and her frank concern for the future of our planet have made me feel at home in this field. Thanks, also, to Eric Poulin, Jude Graether, Simmons University’s School for Library and Information Science,  and all of the friends and family who welcomed my conversations about time and trees and extinction in the summer of 2023 as I recovered from reading The Overstory by Richard Powers.&lt;/em&gt;&lt;/p&gt;
  1174.  
  1175.  
  1176.  
  1177. &lt;hr class=&quot;wp-block-separator has-alpha-channel-opacity&quot; /&gt;
  1178.  
  1179.  
  1180.  
  1181. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;strong&gt;Bibliography&lt;/strong&gt;&lt;/h3&gt;
  1182.  
  1183.  
  1184.  
  1185. &lt;p&gt;Ahmed, Sara. &lt;em&gt;The Cultural Politics of Emotion&lt;/em&gt;. New York: Routledge, 2004.&lt;/p&gt;
  1186.  
  1187.  
  1188.  
  1189. &lt;p&gt;Atkinson, Jennifer. “Mourning Climate Loss: Ritual and Collective Grief in the Age of Crisis.” &lt;em&gt;CSPA Quarterly&lt;/em&gt;, no. 32 (2021): 8–19. &lt;a href=&quot;https://www.jstor.org/stable/27090383&quot;&gt;https://www.jstor.org/stable/27090383&lt;/a&gt;. &lt;/p&gt;
  1190.  
  1191.  
  1192.  
  1193. &lt;p&gt;Barad, Karen Michelle. &lt;em&gt;Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning&lt;/em&gt;. Durham: Duke University Press, 2007.&lt;/p&gt;
  1194.  
  1195.  
  1196.  
  1197. &lt;p&gt;Calhoun, Joshua. &lt;em&gt;The Nature of the Page: Poetry, Papermaking, and the Ecology of Texts in Renaissance England&lt;/em&gt;. Material Texts. Philadelphia: University of Pennsylvania Press, 2020.&lt;/p&gt;
  1198.  
  1199.  
  1200.  
  1201. &lt;p&gt;Chisholm Hatfield, Samantha, Elizabeth Marino, Kyle Powys Whyte, Kathie D. Dello, and Philip W. Mote. “Indian Time: Time, Seasonality, and Culture in Traditional Ecological Knowledge of Climate Change.” &lt;em&gt;Ecological Processes&lt;/em&gt; 7, no. 1 (December 2018): 25. &lt;a href=&quot;https://doi.org/10.1186/s13717-018-0136-6&quot;&gt;https://doi.org/10.1186/s13717-018-0136-6&lt;/a&gt;. &lt;/p&gt;
  1202.  
  1203.  
  1204.  
  1205. &lt;p&gt;Cifor, Marika. “Stains and Remains: Liveliness, Materiality, and the Archival Lives of Queer Bodies.” &lt;em&gt;Australian Feminist Studies&lt;/em&gt; 32, no. 91–92 (April 3, 2017): 5–21. &lt;a href=&quot;https://doi.org/10.1080/08164649.2017.1357014&quot;&gt;https://doi.org/10.1080/08164649.2017.1357014&lt;/a&gt;. &lt;/p&gt;
  1206.  
  1207.  
  1208.  
  1209. &lt;p&gt;Clark, Timothy. “Derangements of Scale.” In &lt;em&gt;Telemorphosis: Theory in the Era of Climate Change&lt;/em&gt;, edited by Tom Cohen, 1:148–66. Critical Climate Change. Open Humanities Press, 2012. &lt;a href=&quot;https://durham-repository.worktribe.com/output/1656912&quot;&gt;https://durham-repository.worktribe.com/output/1656912&lt;/a&gt;. &lt;/p&gt;
  1210.  
  1211.  
  1212.  
  1213. &lt;p&gt;Craps, Stef. “Climate Change and the Art of Anticipatory Memory.” &lt;em&gt;Parallax&lt;/em&gt; 23, no. 4 (October 2, 2017): 479–92. &lt;a href=&quot;https://doi.org/10.1080/13534645.2017.1374518&quot;&gt;https://doi.org/10.1080/13534645.2017.1374518&lt;/a&gt;.&lt;/p&gt;
  1214.  
  1215.  
  1216.  
  1217. &lt;p&gt;———. “Guilty Grieving in an Age of Ecocide.” &lt;em&gt;Parallax&lt;/em&gt; 29, no. 3 (July 3, 2023): 323–42. &lt;a href=&quot;https://doi.org/10.1080/13534645.2024.2302666&quot;&gt;https://doi.org/10.1080/13534645.2024.2302666&lt;/a&gt;. &lt;/p&gt;
  1218.  
  1219.  
  1220.  
  1221. &lt;p&gt;Craps, Stef, Rick Crownshaw, Jennifer Wenzel, Rosanne Kennedy, Claire Colebrook, and Vin Nardizzi. “Memory Studies and the Anthropocene: A Roundtable.” &lt;em&gt;Memory Studies&lt;/em&gt; 11, no. 4 (October 2018): 498–515. &lt;a href=&quot;https://doi.org/10.1177/1750698017731068&quot;&gt;https://doi.org/10.1177/1750698017731068&lt;/a&gt;. &lt;/p&gt;
  1222.  
  1223.  
  1224.  
  1225. &lt;p&gt;Cunsolo, Ashlee, and Neville R. Ellis. “Ecological Grief as a Mental Health Response to Climate Change-Related Loss.” &lt;em&gt;Nature Climate Change&lt;/em&gt; 8, no. 4 (April 2018): 275–81. &lt;a href=&quot;https://doi.org/10.1038/s41558-018-0092-2&quot;&gt;https://doi.org/10.1038/s41558-018-0092-2&lt;/a&gt;. &lt;/p&gt;
  1226.  
  1227.  
  1228.  
  1229. &lt;p&gt;Davis, Heather, and Zoe Todd. “On the Importance of a Date, or, Decolonizing the Anthropocene.” &lt;em&gt;ACME: An International Journal for Critical Geographies&lt;/em&gt;, December 20, 2017, 761-780 Pages. &lt;a href=&quot;https://doi.org/10.14288/ACME.V16I4.1539&quot;&gt;https://doi.org/10.14288/ACME.V16I4.1539&lt;/a&gt;. &lt;/p&gt;
  1230.  
  1231.  
  1232.  
  1233. &lt;p&gt;de Massol de Rebetz, Clara. “Remembrance Day for Lost Species: Remembering and Mourning Extinction in the Anthropocene.” &lt;em&gt;Memory Studies&lt;/em&gt; 13, no. 5 (October 2020): 875–88. &lt;a href=&quot;https://doi.org/10.1177/1750698020944605&quot;&gt;https://doi.org/10.1177/1750698020944605&lt;/a&gt;.&lt;/p&gt;
  1234.  
  1235.  
  1236.  
  1237. &lt;p&gt;Gibson, Julia D. “Practicing Palliation for Extinction and Climate Change.” &lt;em&gt;Environmental Humanities&lt;/em&gt; 15, no. 1 (March 1, 2023): 208–30. &lt;a href=&quot;https://doi.org/10.1215/22011919-10216250&quot;&gt;https://doi.org/10.1215/22011919-10216250&lt;/a&gt;.&lt;/p&gt;
  1238.  
  1239.  
  1240.  
  1241. &lt;p&gt;Haraway, Donna. “Anthropocene, Capitalocene, Plantationocene, Chthulucene: Making Kin.” &lt;em&gt;Environmental Humanities&lt;/em&gt; 6, no. 1 (May 1, 2015): 159–65. &lt;a href=&quot;https://doi.org/10.1215/22011919-3615934&quot;&gt;https://doi.org/10.1215/22011919-3615934&lt;/a&gt;. &lt;/p&gt;
  1242.  
  1243.  
  1244.  
  1245. &lt;p&gt;Le Guin, Ursula K. “Speech in Acceptance of the National Book Foundation Medal for Distinguished Contribution to American Letters.” Ursula K. Le Guin. Accessed April 28, 2025. &lt;a href=&quot;https://www.ursulakleguin.com/nbf-medal&quot;&gt;https://www.ursulakleguin.com/nbf-medal&lt;/a&gt;. &lt;/p&gt;
  1246.  
  1247.  
  1248.  
  1249. &lt;p&gt;Le Guin, Ursula K., Todd Barton, Margaret Chodos-Irvine, and Shruti Swamy. &lt;em&gt;Always Coming Home&lt;/em&gt;. New York : London: Harper Perennial, 2023.&lt;/p&gt;
  1250.  
  1251.  
  1252.  
  1253. &lt;p&gt;Lewis, Simon L., and Mark A. Maslin. “Defining the Anthropocene.” &lt;em&gt;Nature&lt;/em&gt; 519, no. 7542 (March 2015): 171–80. &lt;a href=&quot;https://doi.org/10.1038/nature14258&quot;&gt;https://doi.org/10.1038/nature14258&lt;/a&gt;. &lt;/p&gt;
  1254.  
  1255.  
  1256.  
  1257. &lt;p&gt;Lövbrand, Eva, Silke Beck, Jason Chilvers, Tim Forsyth, Johan Hedrén, Mike Hulme, Rolf Lidskog, and Eleftheria Vasileiadou. “Who Speaks for the Future of Earth? How Critical Social Science Can Extend the Conversation on the Anthropocene.” &lt;em&gt;Global Environmental Change&lt;/em&gt; 32 (May 2015): 211–18. &lt;a href=&quot;https://doi.org/10.1016/j.gloenvcha.2015.03.012&quot;&gt;https://doi.org/10.1016/j.gloenvcha.2015.03.012&lt;/a&gt;.&lt;/p&gt;
  1258.  
  1259.  
  1260.  
  1261. &lt;p&gt;Lowrey, Annie. “Why the Phrase ‘Late Capitalism’ Is Suddenly Everywhere.” &lt;em&gt;The Atlantic&lt;/em&gt;, May 1, 2017. &lt;a href=&quot;https://www.theatlantic.com/business/archive/2017/05/late-capitalism/524943/&quot;&gt;https://www.theatlantic.com/business/archive/2017/05/late-capitalism/524943/&lt;/a&gt;. &lt;/p&gt;
  1262.  
  1263.  
  1264.  
  1265. &lt;p&gt;Radio, Erik. “Documents for the Nonhuman.” &lt;em&gt;Journal of Critical Library and Information Studies&lt;/em&gt; 3, no. 1 (May 17, 2020). &lt;a href=&quot;https://doi.org/10.24242/jclis.v3i1.108&quot;&gt;https://doi.org/10.24242/jclis.v3i1.108&lt;/a&gt;. &lt;/p&gt;
  1266.  
  1267.  
  1268.  
  1269. &lt;p&gt;Scranton, Roy. “Learning How to Die in the Anthropocene.” Opinionator, November 10, 2013. &lt;a href=&quot;https://archive.nytimes.com/opinionator.blogs.nytimes.com/2013/11/10/learning-how-to-die-in-the-anthropocene/&quot;&gt;https://archive.nytimes.com/opinionator.blogs.nytimes.com/2013/11/10/learning-how-to-die-in-the-anthropocene/&lt;/a&gt;. &lt;/p&gt;
  1270.  
  1271.  
  1272.  
  1273. &lt;p&gt;Sharpe, Christina Elizabeth. &lt;em&gt;In the Wake: On Blackness and Being&lt;/em&gt;. Durham London: Duke University Press, 2016.&lt;/p&gt;
  1274.  
  1275.  
  1276.  
  1277. &lt;p&gt;Stuchel, Dani. “Material Provocations in the Archives.” &lt;em&gt;Journal of Critical Library and Information Studies&lt;/em&gt; 3, no. 1 (May 17, 2020). &lt;a href=&quot;https://doi.org/10.24242/jclis.v3i1.103&quot;&gt;https://doi.org/10.24242/jclis.v3i1.103&lt;/a&gt;. &lt;/p&gt;
  1278.  
  1279.  
  1280.  
  1281. &lt;p&gt;Verywell Mind. “What Is the Proximity Principle in Psychology?” Accessed April 28, 2025. &lt;a href=&quot;https://www.verywellmind.com/what-is-the-proximity-principle-in-psychology-5195099&quot;&gt;https://www.verywellmind.com/what-is-the-proximity-principle-in-psychology-5195099&lt;/a&gt;. &lt;/p&gt;
  1282.  
  1283.  
  1284.  
  1285. &lt;p&gt;Wallace-Wells, David. “The Uninhabitable Earth.” Intelligencer. New York Magazine, July 9, 2017. &lt;a href=&quot;https://nymag.com/intelligencer/2017/07/climate-change-earth-too-hot-for-humans.html&quot;&gt;https://nymag.com/intelligencer/2017/07/climate-change-earth-too-hot-for-humans.html&lt;/a&gt;. &lt;/p&gt;
  1286.  
  1287.  
  1288.  
  1289. &lt;p&gt;Whyte, Kyle P. “Indigenous Science (Fiction) for the Anthropocene: Ancestral Dystopias and Fantasies of Climate Change Crises.” &lt;em&gt;Environment and Planning E: Nature and Space&lt;/em&gt; 1, no. 1–2 (March 2018): 224–42. &lt;a href=&quot;https://doi.org/10.1177/2514848618777621&quot;&gt;https://doi.org/10.1177/2514848618777621&lt;/a&gt;. &lt;/p&gt;
  1290.  
  1291.  
  1292.  
  1293. &lt;p&gt;Whyte, Kyle Powys. “Time as Kinship.” In &lt;em&gt;The Cambridge Companion to Environmental Humanities&lt;/em&gt;, edited by Jeffrey Cohen and Stephanie Foote. Cambridge University Press, 2021.&lt;/p&gt;
  1294.  
  1295.  
  1296.  
  1297. &lt;p&gt;Winn, Samantha R. “Dying Well In the Anthropocene: On the End of Archivists.” &lt;em&gt;Journal of Critical Library and Information Studies&lt;/em&gt; 3, no. 1 (May 17, 2020). &lt;a href=&quot;https://doi.org/10.24242/jclis.v3i1.107&quot;&gt;https://doi.org/10.24242/jclis.v3i1.107&lt;/a&gt;. &lt;/p&gt;
  1298.  
  1299.  
  1300.  
  1301. &lt;p&gt;Yusoff, Kathryn. “Geologic Life: Prehistory, Climate, Futures in the Anthropocene.” &lt;em&gt;Environment and Planning D: Society and Space&lt;/em&gt; 31, no. 5 (October 2013): 779–95. &lt;a href=&quot;https://doi.org/10.1068/d11512&quot;&gt;https://doi.org/10.1068/d11512&lt;/a&gt;. &lt;/p&gt;
  1302.  
  1303.  
  1304. &lt;ol class=&quot;wp-block-footnotes&quot;&gt;&lt;li id=&quot;b29f5d58-ce5a-4858-b332-8e422d3d67ac&quot;&gt;Le Guin, &lt;em&gt;Always Coming Home, &lt;/em&gt;3. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b29f5d58-ce5a-4858-b332-8e422d3d67ac-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;de2ca204-cb78-42c1-b1e8-9e1aa1fe0dcc&quot;&gt;Lewis and Maslin, “Defining the Anthropocene,” 171. A committee of academics from the International Commission on Stratigraphy formally rejected the designation of the Anthropocene as an epoch in March 2024 after fifteen years under consideration. Its conceptual significance to disciplines beyond geology, however, was cemented during that time. See &lt;a href=&quot;https://www.nature.com/articles/d41586-024-00675-8&quot;&gt;this update from &lt;em&gt;Nature&lt;/em&gt;&lt;/a&gt; for more information on the committee vote and Lewis and Maslin’s article for further context on the Anthropocene and its geological markers. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#de2ca204-cb78-42c1-b1e8-9e1aa1fe0dcc-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;6d326017-fec3-4c28-8c79-a0648960dc80&quot;&gt;Lövbrand et al., “Who Speaks for the Future of Earth,” 211. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#6d326017-fec3-4c28-8c79-a0648960dc80-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;f6d09ba1-9719-4278-9812-3de999739923&quot;&gt;See &lt;a href=&quot;https://doi.org/10.1016/j.crm.2018.03.005&quot;&gt;“American Archives and Climate Change: Risk and Adaptation”&lt;/a&gt; by Tara Mazurczyk et. al. and Amanda Oliver’s &lt;a href=&quot;https://doi.org/10.1108/RMJ-10-2020-0035&quot;&gt;“Impact of Climate Change on Canadian Archives”&lt;/a&gt; for examples. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#f6d09ba1-9719-4278-9812-3de999739923-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;a88b57b0-618a-4a1c-9a87-9bf6b1bf8d8f&quot;&gt;See “&lt;a href=&quot;https://www.jstor.org/stable/10.2307/48659833&quot;&gt;Toward Environmentally Sustainable Digital Preservation”&lt;/a&gt; by Keith L. Pendergrass et al. and Ben Goldman’s &lt;a href=&quot;https://muse.jhu.edu/pub/1/article/916987&quot;&gt;“Understanding and Reducing Greenhouse Gas Emissions in Academic Libraries”&lt;/a&gt; for examples. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#a88b57b0-618a-4a1c-9a87-9bf6b1bf8d8f-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;cd555baa-bf29-40bd-934c-74cb8df74272&quot;&gt;See Michael Kornfeind’s &lt;a href=&quot;https://worldlibraries.dom.edu/index.php/worldlib/article/view/594&quot;&gt;“Advocacy and Action: How Libraries Across the Globe Are Addressing Climate Change”&lt;/a&gt; and &lt;a href=&quot;https://doi.org/10.1080/13614533.2023.2230652&quot;&gt;“The Role of Academic Libraries in Climate Action”&lt;/a&gt; by Leo Appleton and Nick Woolley for examples. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#cd555baa-bf29-40bd-934c-74cb8df74272-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;9d6bfed7-4fb5-4ae2-ab79-ca97e50ba8c3&quot;&gt;See &lt;a href=&quot;https://doi.org/10.24242/jclis.v3i1.156&quot;&gt;“Libraries and Archives in the Anthropocene,”&lt;/a&gt; a special issue of the &lt;em&gt;Journal of Critical Library and Information Studies &lt;/em&gt;edited by Eira Tansey and Robert D. Montoya, as well as Bethany Nowviskie’s &lt;a href=&quot;https://doi.org/10.1093/llc/fqv015&quot;&gt;“Digital Humanities in the Anthropocene”&lt;/a&gt; for insightful examples of this kind of nudging. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9d6bfed7-4fb5-4ae2-ab79-ca97e50ba8c3-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;7b2200e1-59cc-466d-8d1e-1706b79980ce&quot;&gt;Winn, “Dying Well,” 3. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#7b2200e1-59cc-466d-8d1e-1706b79980ce-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;4c5922e4-9073-457d-aadc-2e72e0187e18&quot;&gt;Radio, “Documents for the Nonhuman,” 2. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#4c5922e4-9073-457d-aadc-2e72e0187e18-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;b819b073-feef-4a95-9141-85fb2e4d0bb3&quot;&gt;See Winn and Radio, as well as Stuchel, “Material Provocations.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b819b073-feef-4a95-9141-85fb2e4d0bb3-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;fe40823f-feef-4366-bc73-a1a7ecff867f&quot;&gt;Yusoff, “Geologic Life,” 781. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fe40823f-feef-4366-bc73-a1a7ecff867f-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;cd6ad65e-3e15-40b8-b51e-0e22f9778d05&quot;&gt;Yusoff, 785. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#cd6ad65e-3e15-40b8-b51e-0e22f9778d05-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;a5568431-b7a4-44e6-a9be-d391c0d28331&quot;&gt;Yusoff, 781. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#a5568431-b7a4-44e6-a9be-d391c0d28331-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;6a030430-c17f-45f6-aa2b-35e0fc29043e&quot;&gt;This and the preceding two quotations are from Yusoff, 784. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#6a030430-c17f-45f6-aa2b-35e0fc29043e-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;29d9016f-6338-4f89-985a-f09071284704&quot;&gt;Along these lines, Claire Colebrook sees the Anthropocene as an “inscriptive event” whose impact depends on the “intensity of inscription” on our planet. Craps et al., “Memory Studies,” 507-508. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#29d9016f-6338-4f89-985a-f09071284704-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;1f5e66e8-c40b-4f38-963d-f2000d0f7968&quot;&gt; Radio, “Documents for the Nonhuman,” 4-5. While perhaps unsettling to conceive of ourselves as the future material record itself rather than its interpreters, the blurred boundary between human and matter in an archival context should not come as a surprise; what Yusoff is positing is simply a different approach to what archivist Marika Cifor calls an “understanding of humans and matter as mutually constitutive.” Cifor, “Stain and Remains,” 8. Cifor considers the affective materiality of human remains in archival settings when she comes across late gay rights activist Harvey Milk’s ponytail in the GLBT Historical Society’s archives in San Francisco. What separates Milk’s hair from Yusoff’s fossilization of bone into rock due to the species-ending effects of fossil extraction is the scale of time in which the Anthropocene encourages us to place ourselves as we consider the historical record. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1f5e66e8-c40b-4f38-963d-f2000d0f7968-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;bb228b6f-e5e3-4cbf-906c-971487d24f00&quot;&gt;Whyte, “Time as Kinship,” 50. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#bb228b6f-e5e3-4cbf-906c-971487d24f00-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;ece36f26-df2b-4d59-8b0b-a8a537133b19&quot;&gt;Whyte, 39. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ece36f26-df2b-4d59-8b0b-a8a537133b19-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;df737817-c5bd-4f23-b492-faa73b772443&quot;&gt;This and the preceding quotation are from Hatfield et al., “Indian Time,” 5. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#df737817-c5bd-4f23-b492-faa73b772443-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;93cb57cc-8562-4d50-8f9f-ccb13e6dd5fa&quot;&gt;Whyte, “Time as Kinship,” 53. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#93cb57cc-8562-4d50-8f9f-ccb13e6dd5fa-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;3d47ba75-bd46-4865-b771-73c64bd685b2&quot;&gt;Whyte, “Ancestral Dystopias,” 228. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3d47ba75-bd46-4865-b771-73c64bd685b2-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;1fcd47de-356e-4758-9ed0-50f529dfbfc9&quot;&gt;Whyte, “Ancestral Dystopias,” 229. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1fcd47de-356e-4758-9ed0-50f529dfbfc9-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;c3654cb2-e56a-4762-8b5f-e260820201ed&quot;&gt;Hatfield et al., “Indian Time,” 5. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#c3654cb2-e56a-4762-8b5f-e260820201ed-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;a4cb055d-fd4d-4317-ba83-38294effefef&quot;&gt;Le Guin, &lt;em&gt;Always Coming Home, &lt;/em&gt;169. The Kesh language does not broadly distinguish between humans and non-humans, but rather treats all beings with respect and agency. This was likely informed by Le Guin’s extensive research of Indigenous relationships with land in writing the novel. See Robin Wall Kimmerer’s &lt;a href=&quot;https://birchbarkbooks.com/products/braiding-sweetgrass&quot;&gt;&lt;em&gt;Braiding Sweetgrass&lt;/em&gt;&lt;/a&gt; for more insight into animacy in Native languages and Le Guin’s &lt;a href=&quot;https://dc.swosu.edu/mythlore/vol15/iss2/1&quot;&gt;“Legends for a New Land”&lt;/a&gt; speech to learn more about the novel’s inspiration and writing process. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#a4cb055d-fd4d-4317-ba83-38294effefef-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;01378c31-bdac-41d6-bebd-51913591a222&quot;&gt;Le Guin, 169. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#01378c31-bdac-41d6-bebd-51913591a222-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;fd563464-b29f-4cdb-9eef-1dad3431b3dc&quot;&gt;Le Guin, 191-2. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fd563464-b29f-4cdb-9eef-1dad3431b3dc-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;fd18b95b-ab3c-47fb-9ae9-ba0be694005a&quot;&gt;Le Guin, 169. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fd18b95b-ab3c-47fb-9ae9-ba0be694005a-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;3f47cc57-172b-4480-84ee-bcba0a176c75&quot;&gt;Vinney, “Proximity Principle.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3f47cc57-172b-4480-84ee-bcba0a176c75-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;b1f8a53b-755a-4984-a80b-3ada2cfeef80&quot;&gt;Whyte, “Ancestral Dystopias,” 228-9. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b1f8a53b-755a-4984-a80b-3ada2cfeef80-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;be15f66c-e793-4dd0-b17d-63ae83967179&quot;&gt;This and the preceding quotation are from Gibson, “Practicing Palliation,” 228. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#be15f66c-e793-4dd0-b17d-63ae83967179-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;876d2761-d11e-4abb-8053-42016c969dc2&quot;&gt;Barad, &lt;em&gt;Entanglement of Matter and Meaning.&lt;/em&gt; &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#876d2761-d11e-4abb-8053-42016c969dc2-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;9962695d-7d30-453f-b724-0218edf3cf27&quot;&gt;Cifor, “Stains and Remains,” 6. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9962695d-7d30-453f-b724-0218edf3cf27-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;8ffb17e1-d58c-4811-abac-15ba85ff8a10&quot;&gt;Cifor, 9. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#8ffb17e1-d58c-4811-abac-15ba85ff8a10-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;58abf47b-c09c-4d56-a7b6-9b543f71e633&quot;&gt;Cifor, 12. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#58abf47b-c09c-4d56-a7b6-9b543f71e633-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;5515a0c1-980e-451a-8a1c-1c4f49ab40a0&quot;&gt;Cifor, 16. See Ahmed, &lt;em&gt;Cultural Politics of Emotion,&lt;/em&gt; 84 for more on this definition of disgust. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#5515a0c1-980e-451a-8a1c-1c4f49ab40a0-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;0b2fd0e4-497e-469a-8c3c-52ba96189cf6&quot;&gt;Cifor, 18. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#0b2fd0e4-497e-469a-8c3c-52ba96189cf6-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;d335d4c7-3c32-49b0-b5ee-cd81bf335a37&quot;&gt;Stuchel, “Material Provocations,” 15. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d335d4c7-3c32-49b0-b5ee-cd81bf335a37-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;cd783224-9811-4340-82cf-89a338dd81b7&quot;&gt;Stuchel, 15. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#cd783224-9811-4340-82cf-89a338dd81b7-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;4afabc36-251a-4e9e-af73-e3d0409cea93&quot;&gt;Stuchel, 4. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#4afabc36-251a-4e9e-af73-e3d0409cea93-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;75163e1e-cddd-44d3-8ff9-0ff420994ee6&quot;&gt;Stuchel, 18. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#75163e1e-cddd-44d3-8ff9-0ff420994ee6-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;9de0c803-0e87-43da-898f-36ce4776f32d&quot;&gt;Le Guin, &lt;em&gt;Always Coming Home, &lt;/em&gt;576-7. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9de0c803-0e87-43da-898f-36ce4776f32d-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;63c49b3d-2880-4959-883a-8440f26050e8&quot;&gt;Stuchel, “Material Provocations,” 4. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#63c49b3d-2880-4959-883a-8440f26050e8-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;ebc2cf24-f36e-4325-ab70-0ad7b0d9c92b&quot;&gt;Stuchel, 20. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ebc2cf24-f36e-4325-ab70-0ad7b0d9c92b-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;3f857653-4b35-40cc-8a14-53c97ec8c6d3&quot;&gt;For more on ecological grief, see Cunsolo and Ellis, “Ecological Grief;” Craps, “Guilty Grieving;” and Atkinson, “Mourning Climate Loss.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3f857653-4b35-40cc-8a14-53c97ec8c6d3-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;fcf30ac4-d70a-4200-bc12-3f01e2bcd030&quot;&gt;Gibson, “Practicing Palliation,” 216. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fcf30ac4-d70a-4200-bc12-3f01e2bcd030-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;3bc7a048-659a-4b56-8a02-b41a8251b4cd&quot;&gt;Cifor, 18. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#3bc7a048-659a-4b56-8a02-b41a8251b4cd-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;8ba35c4b-97bf-4163-a65d-96b3ad2bdb9c&quot;&gt;Craps et al., “Memory Studies and the Anthropocene,” 501. Here Crownshaw is applying Timothy Clark’s concept of “derangements of scale” to memory studies. Clark, “Derangements of Scale.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#8ba35c4b-97bf-4163-a65d-96b3ad2bdb9c-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;0d7fb826-2191-480e-b543-65cd831cd748&quot;&gt;Craps et al., 501. David Wallace-Wells sheds some light on these problems of scale in his &lt;a href=&quot;https://nymag.com/intelligencer/2017/07/climate-change-earth-too-hot-for-humans.html&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;&lt;em&gt;New York Magazine &lt;/em&gt;piece&lt;/a&gt; detailing likely climate scenarios over the next century. Reasons for our “incredible failure of imagination” regarding this issue include “the smallness (two degrees), and largeness (1.8 trillion tons), and abstractness (400 parts per million) of the numbers,” as well as “the simple speed of change and, also, its slowness, such that we are only seeing effects now of warming from decades past.” Wallace-Wells, “Uninhabitable Earth.” He also explains multiple examples of the feedback loops Crownshaw refers to, including projections of melting permafrost and drought-induced rainforest fires releasing enormous amounts of carbon into the atmosphere and therefore continuing the cycle. The article, though accessible and informative, is not for the faint of heart.  &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#0d7fb826-2191-480e-b543-65cd831cd748-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;d0f0fa43-4079-4746-bf2d-6e0554440c69&quot;&gt;Craps et al., 501. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d0f0fa43-4079-4746-bf2d-6e0554440c69-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;d431a863-2629-4aad-b6c2-2c3e8c81b55f&quot;&gt;See Stuchel, “Material Provocations,” and Radio, “Documents for the Nonhuman.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d431a863-2629-4aad-b6c2-2c3e8c81b55f-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;311fd9a5-33fb-4cd0-80da-2b0b4766acd8&quot;&gt;Craps, “Anticipatory Memory.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#311fd9a5-33fb-4cd0-80da-2b0b4766acd8-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;ccecd455-e21e-4023-83dc-3191012c4580&quot;&gt;As Wallace-Wells puts it so succinctly, “[A]version arising from fear is a form of denial, too.” Wallace-Wells, “Uninhabitable Earth.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ccecd455-e21e-4023-83dc-3191012c4580-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;fa54bfb2-5a9a-4baa-bef8-8d5569ab6cb2&quot;&gt;Gibson, “Practicing Palliation,” 228. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#fa54bfb2-5a9a-4baa-bef8-8d5569ab6cb2-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;45c4ea2f-5485-4f4b-b562-03fe5b6dff99&quot;&gt;Whyte, “Ancestral Dystopias,” 226-7. Whyte cites various scholars and activists who have written or spoken about this general idea, including Lee Sprague, Larry Gross, Grace Dillon, and Conrad Scott.  &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#45c4ea2f-5485-4f4b-b562-03fe5b6dff99-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;d7b1b8a2-e6af-4efa-aa01-afbd1844a4d9&quot;&gt; Whyte, “Ancestral Dystopias,” 234. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d7b1b8a2-e6af-4efa-aa01-afbd1844a4d9-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;ca936b9c-8552-4f8b-9c5a-5ac4868a58b9&quot;&gt;Whyte, “Time as Kinship,” 45, 53. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ca936b9c-8552-4f8b-9c5a-5ac4868a58b9-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;914b238e-6a72-46a2-a39f-7d4bf8b9a8de&quot;&gt;Whyte, “Time as Kinship,” 54. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#914b238e-6a72-46a2-a39f-7d4bf8b9a8de-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;515ff931-2ed2-4895-a5a1-ad3cb6797292&quot;&gt;Davis and Todd, “Decolonizing the Anthropocene,” 772. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#515ff931-2ed2-4895-a5a1-ad3cb6797292-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;37a77ac7-6f7c-47ee-80c0-6581e01da0bd&quot;&gt;Davis and Todd, 774. The authors note the connections between their seismic shockwave concept and scholar Christina Sharpe’s “wake work,” a metaphor illustrating the ongoing effects of the transatlantic slave trade using the image of the ships’ wake. “In the wake, the past that is not past reappears, always, to rupture the present.” Sharpe, &lt;em&gt;In the Wake,&lt;/em&gt; 9. Davis and Todd seek to “expand and pluralize collective understandings of the disasters of the Anthropocene” (772). &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#37a77ac7-6f7c-47ee-80c0-6581e01da0bd-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;9186e44d-7e9e-49ef-94c4-aae1dfe381f0&quot;&gt;Whyte, “Ancestral Dystopias,” 236. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9186e44d-7e9e-49ef-94c4-aae1dfe381f0-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;944922b1-6afe-42ad-b52f-f8905b783d5e&quot;&gt;Whyte, 236. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#944922b1-6afe-42ad-b52f-f8905b783d5e-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;8d4bdc01-aba0-48fb-a156-98df9e71eaf1&quot;&gt;Yusoff, “Geologic Life,” 784. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#8d4bdc01-aba0-48fb-a156-98df9e71eaf1-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;d6140629-cea4-4d6f-a667-d19efa75945c&quot;&gt;Davis and Todd, “Decolonizing the Anthropocene,” 769. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#d6140629-cea4-4d6f-a667-d19efa75945c-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;750e96a4-42e1-40cd-b753-434701637bd8&quot;&gt;Davis and Todd, 772. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#750e96a4-42e1-40cd-b753-434701637bd8-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;f39a0994-7083-4ab1-aab7-946d93ccf0c7&quot;&gt;Le Guin, “Speech.” She spoke these words in 2014, when she was eighty-four years old, during her &lt;a href=&quot;https://www.ursulakleguin.com/nbf-medal&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;acceptance speech&lt;/a&gt; for the National Book Foundation’s Medal for Distinguished Contribution to American Letters. The context for the comment involves seeking accountability for the publishing industry’s greed and the pressure it puts on artists in its effort to maximize profits. Earlier in the speech she says, “Hard times are coming, when we’ll be wanting the voices of writers who can see alternatives to how we live now, can see through our fear-stricken society and its obsessive technologies to other ways of being, and even imagine real grounds for hope. We’ll need writers who can remember freedom—poets, visionaries—realists of a larger reality.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#f39a0994-7083-4ab1-aab7-946d93ccf0c7-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;13ed1670-b582-4979-a61f-590915c74a68&quot;&gt;As labor expert Richard Yeselson told the &lt;em&gt;Atlantic &lt;/em&gt;in 2017, “Let’s allude to the big, giant, totalistic system that is underneath everything. And let’s give it more than a hint of foreboding. Late capitalism. &lt;em&gt;Late &lt;/em&gt;is so pregnant.” Lowrey, “Late Capitalism.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#13ed1670-b582-4979-a61f-590915c74a68-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;85643f8c-0ba0-42b5-ae16-0a2e7d6d7b12&quot;&gt;Yusoff, “Geologic Life,” 782-783. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#85643f8c-0ba0-42b5-ae16-0a2e7d6d7b12-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;1c96bfd6-2d51-4fa2-82fd-669986b48f69&quot;&gt;Yusoff, 782. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1c96bfd6-2d51-4fa2-82fd-669986b48f69-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;c7bad781-8abd-4087-8a7a-2f6072605442&quot;&gt;Gibson, “Practicing Palliation,” 210. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#c7bad781-8abd-4087-8a7a-2f6072605442-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;c6897d60-8258-4b81-8d36-4106c1ea92f2&quot;&gt;Stuchel, “Material Provocations,” 14-15. See DeSilvey, &lt;em&gt;Curated Decay.&lt;/em&gt; &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#c6897d60-8258-4b81-8d36-4106c1ea92f2-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;7e74d1a8-34f9-4d17-9ae8-64b32a0a1fad&quot;&gt;Stuchel, 18 &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#7e74d1a8-34f9-4d17-9ae8-64b32a0a1fad-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;280c5c3b-3abf-4467-9742-30743dbdb56a&quot;&gt;For further discussion of the ecological context of manuscripts and archival documents through an environmental humanities lens, see Calhoun, &lt;em&gt;Nature of the Page.&lt;/em&gt; &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#280c5c3b-3abf-4467-9742-30743dbdb56a-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;b122a6d4-0213-45d5-ba8e-1596a5eb51b8&quot;&gt;Winn, “Dying Well,” 12. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#b122a6d4-0213-45d5-ba8e-1596a5eb51b8-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;4f90ed06-38b1-4835-b68c-955945d73515&quot;&gt;Whyte, “Ancestral Dystopias.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#4f90ed06-38b1-4835-b68c-955945d73515-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;532c3111-3e05-4304-bb56-c3edd974e905&quot;&gt;De Massol de Rebetz, “Remembrance Day.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#532c3111-3e05-4304-bb56-c3edd974e905-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;02d3e66b-6108-4509-8c81-122a1c37855b&quot;&gt;Atkinson, “Mourning Climate Loss,” 18. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#02d3e66b-6108-4509-8c81-122a1c37855b-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;e82fddd3-cd76-4244-928d-17825e1d4609&quot;&gt;Instances of this phenomenon cited in this article alone include Gibson, “Practicing Palliation;” Whyte, “Ancestral Dystopias;” Craps, “Anticipatory Memory;” Winn, “Dying Well;” and Haraway, “Making Kin.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#e82fddd3-cd76-4244-928d-17825e1d4609-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;ddc5f952-2a08-4286-a3c6-3d19baa2b7d7&quot;&gt;In one example, Tochi Onyebuchi’s &lt;a href=&quot;https://search.worldcat.org/title/1264176312&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;&lt;em&gt;Goliath&lt;/em&gt;&lt;/a&gt;&lt;em&gt; &lt;/em&gt;considers the effects of gentrification on poor, Black residents of New Haven, Connecticut when Earth’s atmosphere is cleaned up and the global elite return from their outer space enclaves. In another, Yoko Tawada’s &lt;a href=&quot;https://search.worldcat.org/title/1298287613&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;&lt;em&gt;Scattered All Over the Earth&lt;/em&gt;&lt;/a&gt; puts a surreal spin on language and climate migration. I led discussions about both books with patrons for Lilly Library’s Climate Fiction Book Club in Florence, Massachusetts in 2023-2024. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#ddc5f952-2a08-4286-a3c6-3d19baa2b7d7-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;9a843341-ff25-4a7e-99ed-7452ff1a6a19&quot;&gt;Haraway, “Making Kin,” 60. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#9a843341-ff25-4a7e-99ed-7452ff1a6a19-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;1ab18060-2610-4420-bce5-71e176760eac&quot;&gt; Craps et al., “Memory Studies,” 80. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#1ab18060-2610-4420-bce5-71e176760eac-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;5ef0388e-94dc-4cc2-aa7b-b04f5936839a&quot;&gt;Whyte, “Ancestral Dystopias,” 237. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#5ef0388e-94dc-4cc2-aa7b-b04f5936839a-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;91251989-82d3-4836-8dac-bd842c41f8e6&quot;&gt;Yusoff, “Geologic Life,” 791. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#91251989-82d3-4836-8dac-bd842c41f8e6-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;5cc239dc-1c88-4a07-9ee8-9789969247bb&quot;&gt;Scranton, “Learning to Die.” &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#5cc239dc-1c88-4a07-9ee8-9789969247bb-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;li id=&quot;6613eaf9-1eed-42e9-baf3-a1b6ab67231f&quot;&gt;Nora Almeida and Jen Hoyer provide a model for this line of research in &lt;a href=&quot;https://doi.org/10.24242/jclis.v3i1.96&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;“Living Archives in the Anthropocene,”&lt;/a&gt; parsing the relationship between the Anthropocene, capitalism, and archival practice to highlight an alternative, decentralized “living” structure embodied by Interference Archive in Brooklyn, New York. &lt;a href=&quot;http://www.inthelibrarywiththeleadpipe.org/feed/#6613eaf9-1eed-42e9-baf3-a1b6ab67231f-link&quot;&gt;&lt;img alt=&quot;↩&quot; class=&quot;wp-smiley&quot; src=&quot;https://s.w.org/images/core/emoji/15.0.3/72x72/21a9.png&quot; style=&quot;height: 1em;&quot; /&gt;︎&lt;/a&gt;&lt;/li&gt;&lt;/ol&gt;</content>
  1305.      <author>
  1306.  <name>Nora Zahn</name>
  1307. <uri>https://www.inthelibrarywiththeleadpipe.org</uri>
  1308.      </author>
  1309.    </entry>
  1310.  
  1311.    <entry>
  1312.      <title>Open Knowledge Foundation: Join us at The Tech People Want Summit and reimagine tech for real-world needs</title>
  1313.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/24/join-us-at-the-tech-people-want-summit-and-reimagine-tech-for-real-world-needs/"/>
  1314.      <id>https://blog.okfn.org/?p=34062</id>
  1315.      <updated>2025-06-24T19:52:27+00:00</updated>
  1316.      <content type="html">&lt;p&gt;In July, OKFN is bringing together non-technical professionals – including data practitioners, communicators, project managers, and advocates – to rethink how technology supports our work in conversational sessions.&lt;/p&gt;
  1317. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/24/join-us-at-the-tech-people-want-summit-and-reimagine-tech-for-real-world-needs/&quot;&gt;Join us at The Tech People Want Summit and reimagine tech for real-world needs&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  1318.      <author>
  1319.  <name>OKFN</name>
  1320. <uri>https://blog.okfn.org</uri>
  1321.      </author>
  1322.    </entry>
  1323.  
  1324.    <entry>
  1325.      <title>LibraryThing (Thingology): Talpa Search integration now available in Aspen Discovery</title>
  1326.      <link rel="alternate" type="text/html" href="https://blog.librarything.com/2025/06/talpa-search-integration-now-available-in-aspen-discovery/"/>
  1327.      <id>https://blog.librarything.com/?p=11291</id>
  1328.      <updated>2025-06-24T18:02:12+00:00</updated>
  1329.      <content type="html">&lt;div class=&quot;wp-block-image&quot;&gt;
  1330. &lt;figure class=&quot;alignright size-medium&quot;&gt;&lt;a href=&quot;https://www.talpasearch.com&quot;&gt;&lt;img alt=&quot;&quot; class=&quot;wp-image-11239&quot; height=&quot;300&quot; src=&quot;https://blog.librarything.com/wp-content/uploads/2025/01/talpa-logo-search_1000w-231x300.png&quot; width=&quot;231&quot; /&gt;&lt;/a&gt;&lt;/figure&gt;&lt;/div&gt;
  1331.  
  1332.  
  1333. &lt;p&gt;LibraryThing is thrilled to announce that &lt;a href=&quot;https://www.talpasearch.com/&quot;&gt;Talpa Search&lt;/a&gt; is now available in Aspen Discovery from &lt;a href=&quot;https://www.groveforlibraries.com/&quot;&gt;Grove for Libraries&lt;/a&gt;. With Talpa Search, patrons can search for books and other media using natural language. Talpa Search goes beyond traditional library searching to find items by plot details, genre, descriptions, and more. While Talpa Search works with all library systems, this is its first integration directly inside the catalog.&lt;/p&gt;
  1334.  
  1335.  
  1336.  
  1337. &lt;p&gt;Aspen Discovery libraries can now enable Talpa Search as a search mode, allowing patrons to see results from Talpa Search seamlessly integrated into the look and feel of their library catalog.&lt;/p&gt;
  1338.  
  1339.  
  1340.  
  1341. &lt;p&gt;Talpa Search in Aspen Discovery enables users to search for “novels about France during World War II” or “recent cozy mysteries” and receive relevant results, all within their existing library catalog, without special syntax or filtering.&lt;/p&gt;
  1342.  
  1343.  
  1344.  
  1345. &lt;figure class=&quot;wp-block-image size-large&quot;&gt;&lt;a href=&quot;https://demo.groveforlibraries.com/Union/Search?searchSource=talpa&quot;&gt;&lt;img alt=&quot;&quot; class=&quot;wp-image-11293&quot; height=&quot;750&quot; src=&quot;https://blog.librarything.com/wp-content/uploads/2025/06/YellowHat-1-1024x750.png&quot; width=&quot;1024&quot; /&gt;&lt;/a&gt;&lt;/figure&gt;
  1346.  
  1347.  
  1348.  
  1349. &lt;p&gt;&lt;br /&gt;Just as patrons will love the ability to search their library catalog using natural language, librarians will appreciate the customization options available with the Talpa Search module, including branding, text, and display settings. Talpa Search also includes an optional setting to display a materials request link for items not owned by the library—useful as a collection development tool.&lt;/p&gt;
  1350.  
  1351.  
  1352.  
  1353. &lt;p&gt;Talpa Search is available by subscription and can be accessed through a standalone site for those without Aspen Discovery. You can see the Talpa Search catalog integration in action &lt;a href=&quot;https://demo.groveforlibraries.com/Union/Search?searchSource=talpa&quot;&gt;here&lt;/a&gt;, and the stand-alone site at &lt;a href=&quot;https://www.talpasearch.com/&quot;&gt;https://www.talpasearch.com/&lt;/a&gt;.&lt;/p&gt;
  1354.  
  1355.  
  1356.  
  1357. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;a&gt;&lt;/a&gt;About Grove for Libraries&lt;/h3&gt;
  1358.  
  1359.  
  1360.  
  1361. &lt;p&gt;Grove for Libraries offers development, hosting, support, and more for Aspen Discovery, Aspen LiDA, and other projects related to Aspen. Grove employees also contribute time and expertise to all Aspen Community meetings as well as dedicated Office Hours to help grow and support all libraries and organizations using Aspen. Learn more at &lt;a href=&quot;https://www.groveforlibraries.com/&quot;&gt;https://www.groveforlibraries.com/&lt;/a&gt;&lt;/p&gt;
  1362.  
  1363.  
  1364.  
  1365. &lt;h3 class=&quot;wp-block-heading&quot;&gt;&lt;a&gt;&lt;/a&gt;About Talpa Search&lt;/h3&gt;
  1366.  
  1367.  
  1368.  
  1369. &lt;p&gt;Talpa Search combines cutting-edge technology with data from libraries, publishers, and readers to enable entirely new ways of searching—and find what you’re looking for.&lt;/p&gt;
  1370.  
  1371.  
  1372.  
  1373. &lt;p&gt;Designed specifically for libraries, Talpa Search is available by subscription, has no usage caps, and integrates with a library’s catalog and holdings.&lt;/p&gt;
  1374.  
  1375.  
  1376.  
  1377. &lt;p&gt;Talpa Search was developed by the developers, librarians, and librarian-developers of LibraryThing. Learn more at &lt;a href=&quot;https://www.talpasearch.com/about&quot;&gt;https://www.talpasearch.com/about&lt;/a&gt;.&lt;/p&gt;
  1378.  
  1379.  
  1380.  
  1381. &lt;p&gt;“Talpa” means mole in Latin. (Moles dig for things!)&lt;/p&gt;
  1382.  
  1383.  
  1384.  
  1385. &lt;h3 class=&quot;wp-block-heading&quot;&gt;About Aspen&lt;/h3&gt;
  1386.  
  1387.  
  1388.  
  1389. &lt;p&gt;Aspen is a suite of open-source library software that started with its flagship Discovery and has since expanded to include additional tools like the Aspen LiDA mobile app and Aspen Web Builder.&lt;/p&gt;
  1390.  
  1391.  
  1392.  
  1393. &lt;p&gt;Aspen was the first discovery layer to give patrons seamless access to all library resources in one place. By connecting with multiple content platforms for e-books, library events, digital archives, and other third-party services, Aspen is dedicated to avoiding vendor lock-in and empowering libraries with control by offering them options for which vendors they want to integrate with Aspen.&lt;/p&gt;
  1394.  
  1395.  
  1396.  
  1397. &lt;p&gt;Aspen is used by more than 2,000 libraries worldwide and is supported by a vibrant community of hundreds of librarians. Learn more at &lt;a href=&quot;https://community.aspendiscovery.org/&quot; rel=&quot;noreferrer noopener&quot; target=&quot;_blank&quot;&gt;https://community.aspendiscovery.org/&lt;/a&gt;&lt;/p&gt;
  1398.  
  1399.  
  1400.  
  1401. &lt;p&gt;&lt;/p&gt;</content>
  1402.      <author>
  1403.  <name>Abby</name>
  1404. <uri>https://blog.librarything.com</uri>
  1405.      </author>
  1406.    </entry>
  1407.  
  1408.    <entry>
  1409.      <title>Open Knowledge Foundation: How Open Data Editor empowers our weapon watch network</title>
  1410.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/24/how-open-data-editor-empowers-our-weapon-watch-network/"/>
  1411.      <id>https://blog.okfn.org/?p=34142</id>
  1412.      <updated>2025-06-24T17:48:53+00:00</updated>
  1413.      <content type="html">&lt;p&gt;The coordinator of The Observatoire des armements reports on his experience piloting ODE: &quot;It enabled us to better organize our data processes.&quot;&lt;/p&gt;
  1414. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/24/how-open-data-editor-empowers-our-weapon-watch-network/&quot;&gt;How Open Data Editor empowers our weapon watch network&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  1415.      <author>
  1416.  <name>Topuzogullari Sayat</name>
  1417. <uri>https://blog.okfn.org</uri>
  1418.      </author>
  1419.    </entry>
  1420.  
  1421.    <entry>
  1422.      <title>Open Knowledge Foundation: How Open Data Editor simplifies open data analysis processes</title>
  1423.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/24/how-open-data-editor-simplifies-open-data-analysis-processes/"/>
  1424.      <id>https://blog.okfn.org/?p=34115</id>
  1425.      <updated>2025-06-24T17:25:46+00:00</updated>
  1426.      <content type="html">&lt;p&gt;The Executive Director of The Demography Project reports on their experience piloting ODE: &quot;It rekindle the spark that inspires why we do our work and boosted my team’s collegiality.&quot;&lt;/p&gt;
  1427. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/24/how-open-data-editor-simplifies-open-data-analysis-processes/&quot;&gt;How Open Data Editor simplifies open data analysis processes&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  1428.      <author>
  1429.  <name>Richard Muraya</name>
  1430. <uri>https://blog.okfn.org</uri>
  1431.      </author>
  1432.    </entry>
  1433.  
  1434.    <entry>
  1435.      <title>Open Knowledge Foundation: How Open Data Editor improves our local government data practices</title>
  1436.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/24/how-open-data-editor-improves-our-local-government-data-practices/"/>
  1437.      <id>https://blog.okfn.org/?p=34095</id>
  1438.      <updated>2025-06-24T16:58:51+00:00</updated>
  1439.      <content type="html">&lt;p&gt;The CEO of Open Knowledge Nepal reports on their experience piloting ODE: &quot;It wasn’t just a success in terms of outputs; it was a cultural shift.&quot;&lt;/p&gt;
  1440. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/24/how-open-data-editor-improves-our-local-government-data-practices/&quot;&gt;How Open Data Editor improves our local government data practices&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  1441.      <author>
  1442.  <name>Nikesh Balami</name>
  1443. <uri>https://blog.okfn.org</uri>
  1444.      </author>
  1445.    </entry>
  1446.  
  1447.    <entry>
  1448.      <title>Web Science and Digital Libraries (WS-DL) Group at Old Dominion University: 2025-06-24: GPU Hours Granted on Hypothesis Generation by Oak Ridge Leadership Computing Facility</title>
  1449.      <link rel="alternate" type="text/html" href="https://ws-dl.blogspot.com/2025/06/2025-06-24-gpu-hours-granted-on.html"/>
  1450.      <id>tag:blogger.com,1999:blog-953024975153422094.post-7194407816705243057</id>
  1451.      <updated>2025-06-24T15:45:03+00:00</updated>
  1452.      <content type="html">&lt;p&gt;In collaboration with Oak Ridge National Laboratory, the LAMP-SYS Lab was granted 20,000 node hours on the &lt;a href=&quot;https://www.olcf.ornl.gov/frontier/&quot;&gt;Frontier&lt;/a&gt; supercomputer cluster and 2500 node hours on the &lt;a href=&quot;https://www.olcf.ornl.gov/olcf-resources/compute-systems/andes/&quot;&gt;Andes&lt;/a&gt; supercomputer cluster. This joint proposal is a collaboration with &lt;a href=&quot;https://www.ornl.gov/staff-profile/tirthankar-ghosal&quot;&gt;Dr. Tirthankar Ghosal&lt;/a&gt;, a scientist at the &lt;a href=&quot;https://www.ornl.gov/group/aaims&quot;&gt;Advanced AI Methods at Scale (AAIMS)&lt;/a&gt; group in the &lt;a href=&quot;https://www.ornl.gov/division/nccs&quot;&gt;National Center for Computational Sciences&lt;/a&gt;, &lt;a href=&quot;https://www.ornl.gov/&quot;&gt;Oak Ridge National Laboratory&lt;/a&gt;, my student &lt;a href=&quot;https://domsoos.github.io/&quot;&gt;Dominik Soós&lt;/a&gt;, and me. &lt;/p&gt;&lt;p&gt;The goal of this proposal is to investigate the feasibility of generating hypotheses through interactions between expert LLMs and then ranking hypothesis candidates by &lt;a href=&quot;https://doi.org/10.1126/science.1240474&quot;&gt;Z-scores&lt;/a&gt;, a novelty metric developed by &lt;a href=&quot;https://www.kellogg.northwestern.edu/faculty/uzzi/htm/&quot;&gt;Dr. Uzzi&lt;/a&gt;. &lt;/p&gt;&lt;p&gt;The project will advance hypothesis generation by improving the hypothesis novelty in the candidate generation phase and the candidate selection phase. The introduction of Z-scores provides a more scalable way to automatically evaluate novelty. The multi-agent LLMs will have the potential to mitigate the consistent mistakes and biases made by a single LLM. Existing research mostly relies on data from one narrow domain (e.g., &lt;a href=&quot;https://aclanthology.org/2024.acl-long.18/&quot;&gt;ACL&lt;/a&gt;). The introduction of multiple expert LLMs will overcome the limitation by generating cross-domain hypotheses. We plan to use this exploratory effort to study the scaling efficiency of our methods for larger science datasets that cover most STEM domains (e.g., with scale - &lt;a href=&quot;https://clarivate.com/academia-government/scientific-and-academic-research/research-discovery-and-referencing/web-of-science/&quot;&gt;Web of Science &lt;/a&gt;or &lt;a href=&quot;https://citeseerx.ist.psu.edu&quot;&gt;CiteSeerX&lt;/a&gt;, etc.) and are crucial to leverage cross-disciplinary knowledge for scientific discovery. &lt;/p&gt;&lt;p&gt;&lt;a href=&quot;https://www.cs.odu.edu/~jwu/&quot;&gt;Dr. Jian Wu&lt;/a&gt;, the PI of this project, is an associate professor of &lt;a href=&quot;https://www.odu.edu/computer-science&quot;&gt;Computer Science at Old Dominion University&lt;/a&gt;, Norfolk, VA. His research interests include natural language processing, scholarly big data, information retrieval, digital libraries, and the science of science. &lt;/p&gt;&lt;p&gt;&lt;a href=&quot;https://www.blogger.com/blog/post/edit/953024975153422094/7194407816705243057&quot;&gt;Dr. Tirthankar Ghosal &lt;/a&gt;is a Staff Scientist at Oak Ridge National Laboratory’s National Center for Computational Sciences (NCCS) and will serve as Co-PI on this project. His research expertise spans AI for Science and Operations, Natural Language Processing, Large Language Models, and Machine Learning.&lt;/p&gt;&lt;p&gt;&lt;a href=&quot;https://www.blogger.com/blog/post/edit/953024975153422094/7194407816705243057&quot;&gt;Dominik Soós&lt;/a&gt; is a PhD student of Computer Science at Old Dominion University, where he also received his B.S. and M.S. degrees in Computer Science. His research interests include natural language processing, machine learning, and parallel computing. &lt;/p&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;-- Jian Wu &lt;/p&gt;</content>
  1453.      <author>
  1454.  <name>Jian Wu (noreply@blogger.com)</name>
  1455. <uri>https://ws-dl.blogspot.com/</uri>
  1456.      </author>
  1457.    </entry>
  1458.  
  1459.    <entry>
  1460.      <title>Open Knowledge Foundation: How Open Data Editor helps us embrace FAIR data principles</title>
  1461.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/24/how-open-data-editor-helps-us-embrace-fair-data-principles/"/>
  1462.      <id>https://blog.okfn.org/?p=34087</id>
  1463.      <updated>2025-06-24T14:19:31+00:00</updated>
  1464.      <content type="html">&lt;p&gt;The senior advisor of the City of Zagreb reports on his experience piloting ODE: &quot;It has been incredibly beneficial in my data work.&quot;&lt;/p&gt;
  1465. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/24/how-open-data-editor-helps-us-embrace-fair-data-principles/&quot;&gt;How Open Data Editor helps us embrace FAIR data principles&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  1466.      <author>
  1467.  <name>Kristian Ravić</name>
  1468. <uri>https://blog.okfn.org</uri>
  1469.      </author>
  1470.    </entry>
  1471.  
  1472.    <entry>
  1473.      <title>Open Knowledge Foundation: How Open Data Editor makes Genomic data approachable for our bioinformatics community</title>
  1474.      <link rel="alternate" type="text/html" href="https://blog.okfn.org/2025/06/24/how-open-data-editor-makes-genomic-data-approachable-for-our-bioinformatics-community/"/>
  1475.      <id>https://blog.okfn.org/?p=34073</id>
  1476.      <updated>2025-06-24T14:00:14+00:00</updated>
  1477.      <content type="html">&lt;p&gt;The coordinator of the Bioinformatics Hub of Kenya Initiative reports on her experience piloting ODE: &quot;It simplified everything.&quot;&lt;/p&gt;
  1478. &lt;p&gt;The post &lt;a href=&quot;https://blog.okfn.org/2025/06/24/how-open-data-editor-makes-genomic-data-approachable-for-our-bioinformatics-community/&quot;&gt;How Open Data Editor makes Genomic data approachable for our bioinformatics community&lt;/a&gt; first appeared on &lt;a href=&quot;https://blog.okfn.org&quot;&gt;Open Knowledge Blog&lt;/a&gt;.&lt;/p&gt;</content>
  1479.      <author>
  1480.  <name>Pauline Karega</name>
  1481. <uri>https://blog.okfn.org</uri>
  1482.      </author>
  1483.    </entry>
  1484.  
  1485.    <entry>
  1486.      <title>Library | Ruth Kitchin Tillman: My Coworkers Don't Want AI. They Want Macros</title>
  1487.      <link rel="alternate" type="text/html" href="https://ruthtillman.com/post/macros-not-ai/"/>
  1488.      <id>https://ruthtillman.com/post/macros-not-ai/</id>
  1489.      <updated>2025-06-24T00:00:00+00:00</updated>
  1490.      <content type="html">&lt;p&gt;My coworkers don’t want AI. They want macros.&lt;/p&gt;
  1491. &lt;p&gt;Let me back up a little. I spent April gathering and May refining and organizing requirements for a system to replace our current ILS. This meant asking a lot of people about how they use our current system, taking notes, and turning those notes into requirements. 372 requirements.&lt;sup id=&quot;fnref:1&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fn:1&quot;&gt;1&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1492. &lt;p&gt;Going into this, I knew that some coworkers used macros to streamline tasks. I came out of it with a deeper appreciation of the different ways they’ve done so.&lt;/p&gt;
  1493. &lt;p&gt;It made me think about the various ways vendors are pitching “AI” for their systems and the disconnect between these pitches and the needs people expressed. Because library workers do want more from these systems. We just want something a bit different.&lt;/p&gt;
  1494. &lt;h2 id=&quot;a-few-ways-were-using-macros&quot;&gt;A Few Ways We’re Using Macros&lt;/h2&gt;
  1495. &lt;p&gt;Some environmental factors: We use Sirsi Symphony and the Workflows desktop client in Windows with MacroExpress.&lt;/p&gt;
  1496. &lt;h3 id=&quot;canned-notes&quot;&gt;Canned Notes&lt;/h3&gt;
  1497. &lt;p&gt;One of the simplest things we do with macros is add canned notes. These are most often action notes, which have to be added to existing records. For example, this is a standard deacidification note:&lt;/p&gt;
  1498. &lt;blockquote&gt;
  1499. &lt;p&gt;Deacidified.|c20030107.|iliquid.|xBookkeeper.|jCranberry Township, PA 16066.|kPreservation Technologies Inc.&lt;/p&gt;
  1500. &lt;/blockquote&gt;
  1501. &lt;p&gt;The text string is the same every time except for the subfield c, which grabs today’s date, e.g. 2003-01-07 in the example above.&lt;/p&gt;
  1502. &lt;h3 id=&quot;spine-label-marking&quot;&gt;Spine Label Marking&lt;/h3&gt;
  1503. &lt;p&gt;Let’s look at something more complicated. Our marking team has a set of tasks they need to perform when setting an item’s permanent location and preparing a label. They use a macro that does the following:&lt;/p&gt;
  1504. &lt;ul&gt;
  1505. &lt;li&gt;If it’s in the respective library’s main stacks: update Home Location field to the location which corresponds with the Library field (e.g. if it’s at Altoona, it would be STACKS-AA).&lt;/li&gt;
  1506. &lt;li&gt;Copy the call number to the clipboard, with some line breaks.&lt;/li&gt;
  1507. &lt;li&gt;If the Home Location has a special marking prefix: prepend the marking prefix, again with line breaks. This data isn’t in Symphony at all, it comes from a separate marking table.&lt;/li&gt;
  1508. &lt;/ul&gt;
  1509. &lt;p&gt;They then can paste the result into the label software, ensure the line breaks are correct, and print a label.&lt;/p&gt;
  1510. &lt;p&gt;This saves them:&lt;/p&gt;
  1511. &lt;ul&gt;
  1512. &lt;li&gt;Clicking in the Home Location field and typing or scrolling to find the correct Home Location (when it’s in the main stacks, otherwise they still have to do this part).&lt;/li&gt;
  1513. &lt;li&gt;Clicking in the call number field and copying the data.&lt;/li&gt;
  1514. &lt;li&gt;Opening our locally-maintained marking documentation, checking whether the Home Location is in the prefix table, and copy-pasting that separately into the labeling software.&lt;/li&gt;
  1515. &lt;/ul&gt;
  1516. &lt;p&gt;The marking table used by MacroExpress requires regular maintenance, but it’s done in parallel with updates to our department’s documentation.&lt;/p&gt;
  1517. &lt;p&gt;One macro.&lt;/p&gt;
  1518. &lt;h3 id=&quot;annex-location-management&quot;&gt;Annex Location Management&lt;/h3&gt;
  1519. &lt;p&gt;We also use Symphony to manage our Annex data. There are pros and cons to this, but I’ll be focusing on how macros makes the work much easier.&lt;/p&gt;
  1520. &lt;p&gt;When an item moves to an annex, our team needs to update the following information:&lt;/p&gt;
  1521. &lt;ul&gt;
  1522. &lt;li&gt;The item’s Library and Home Location fields&lt;/li&gt;
  1523. &lt;li&gt;The item’s very specific position within that annex&lt;/li&gt;
  1524. &lt;/ul&gt;
  1525. &lt;p&gt;The second piece of information is encoded in a new barcode. We want to record this data in our Extended Item Information fields (locally mapped) which we use for everything from retrieval to reporting. We need to be able to identify everything in a row, a shelving unit, box, even position in box. So we need it broken apart in a way that can be easily queried or used to generate pull slips. We also want to record information about the size of the box (A3, A4, etc.).&lt;/p&gt;
  1526. &lt;p&gt;To use the macro, our Annex workers click in the barcode field, press a key on their numpad, and scan in the new barcode. This performs the following steps:&lt;/p&gt;
  1527. &lt;ul&gt;
  1528. &lt;li&gt;Update barcode with the new barcode&lt;/li&gt;
  1529. &lt;li&gt;Update the Library field to UP-ANNEX&lt;/li&gt;
  1530. &lt;li&gt;Update Home Location to the specific annex at which the work is being performed&lt;/li&gt;
  1531. &lt;li&gt;Fill in locally-mapped Box Size field using a lookup table and the value entered from the numpad.&lt;/li&gt;
  1532. &lt;li&gt;Fill in a set of locally-mapped Extended Item Information fields with the values of Row, Shelf, Box, and Position within the box based on the value of the barcode.&lt;/li&gt;
  1533. &lt;/ul&gt;
  1534. &lt;p&gt;They use a special numpad which is just an ordinary external numpad that’s had labels applied to its keys. These indicate which key maps to A3, which to A4, etc. Users don’t have to memorize “Press 7 for A3,” they just have to press the A3 key.&lt;/p&gt;
  1535. &lt;p&gt;Once again, the macro eliminates the human error factor that would come with manually typing or even cutting and pasting in any of this data. It transforms a task that would take a good minute of clicking, tabbing, typing/copy-pasting into one that takes maybe 10 seconds.&lt;/p&gt;
  1536. &lt;p&gt;One macro.&lt;/p&gt;
  1537. &lt;h2 id=&quot;why-macros-not-ai&quot;&gt;Why Macros, Not “AI”&lt;/h2&gt;
  1538. &lt;p&gt;To summarize at a more abstract level, we are using macros to:&lt;/p&gt;
  1539. &lt;ul&gt;
  1540. &lt;li&gt;Enter consistent data&lt;/li&gt;
  1541. &lt;li&gt;Avoid typos&lt;/li&gt;
  1542. &lt;li&gt;Improve system design&lt;sup id=&quot;fnref:2&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fn:2&quot;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/li&gt;
  1543. &lt;li&gt;Add conditional logic&lt;/li&gt;
  1544. &lt;li&gt;Save the user repetitive, mindless work&lt;/li&gt;
  1545. &lt;/ul&gt;
  1546. &lt;p&gt;At its best (pattern-recognition), “AI” is overengineered for what we need: logic and lookups.&lt;/p&gt;
  1547. &lt;p&gt;At its worst (predictive text), it’s the opposite of the very concrete and repeated things we want to be able to do.&lt;/p&gt;
  1548. &lt;p&gt;It doesn’t have to be macros. My coworkers would welcome a system which let them create custom workscreens with only the few fields they need for a regular task and let them add their own conditional logic. &lt;sup id=&quot;fnref:3&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fn:3&quot;&gt;3&lt;/a&gt;&lt;/sup&gt; Or they could collect barcodes and run a batch process that inserts those deacidification notes with canned text and a variable date field vs. running the macro on each record.&lt;/p&gt;
  1549. &lt;p&gt;But the direction is one of control, of refinement, of engaging with the system on one’s own terms. That is in contrast to the black box we’re being presented by both “AI” companies and library system vendors.&lt;sup id=&quot;fnref:4&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fn:4&quot;&gt;4&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
  1550. &lt;p&gt;As for me? I’m excited about APIs.&lt;/p&gt;
  1551. &lt;div class=&quot;footnotes&quot;&gt;
  1552. &lt;hr /&gt;
  1553. &lt;ol&gt;
  1554. &lt;li id=&quot;fn:1&quot;&gt;
  1555. &lt;p&gt;Some of those requirements are: “Staff users can check in and check out library materials.” On the one hand, yes of course. On the other hand, a friend’s university just implemented a system in which you can only hire people on the first day of a pay period. No, you can’t hire them on the second day and make it retroactive. Sometimes you have to ask about really, really basic stuff. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fnref:1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
  1556. &lt;/li&gt;
  1557. &lt;li id=&quot;fn:2&quot;&gt;
  1558. &lt;p&gt;I’m thinking here in particular of the ways that our use of macros significantly reduces the need to tab between fields or scroll through controlled values in Workflows. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fnref:2&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
  1559. &lt;/li&gt;
  1560. &lt;li id=&quot;fn:3&quot;&gt;
  1561. &lt;p&gt;I have seen some good ones developed at individual institutions using APIs. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fnref:3&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
  1562. &lt;/li&gt;
  1563. &lt;li id=&quot;fn:4&quot;&gt;
  1564. &lt;p&gt;As I’ve said elsewhere, I think OCR and machine-learning capabilities which can support catalog record creation could be helpful! I’ve also seen great uses of these tools to transform a scanned page into a tab-separated file or reformat existing data quickly. The problem is that we’re being offered tools that also generate non-existent page counts we have to clean up or ones that propose taking over course design for faculty (ELUNA 2025 plenaries). &lt;a class=&quot;footnote-backref&quot; href=&quot;https://ruthtillman.com/categories/library/index.xml#fnref:4&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
  1565. &lt;/li&gt;
  1566. &lt;/ol&gt;
  1567. &lt;/div&gt;</content>
  1568.      <author>
  1569.    <name>Library | Ruth Kitchin Tillman</name>
  1570. <uri>https://ruthtillman.com/categories/library/</uri>
  1571.      </author>
  1572.    </entry>
  1573.  
  1574.    <entry>
  1575.      <title>David Fiander: Posting Org-Mode captures to Mastodon</title>
  1576.      <link rel="alternate" type="text/html" href="http://lisletters.fiander.info/2025/06/posting-org-mode-captures-to-mastodon.html"/>
  1577.      <id>tag:blogger.com,1999:blog-22451554.post-9187413634868382981</id>
  1578.      <updated>2025-06-19T18:50:36+00:00</updated>
  1579.      <content type="html">&lt;p&gt; I keep my personal reading log in an Emacs &lt;a href=&quot;https://orgmode.org/&quot;&gt;org-mode&lt;/a&gt; table. It's nice and small, and works on every computer I use, and thanks to org-mode's &lt;a href=&quot;https://orgmode.org/manual/The-Spreadsheet.html&quot;&gt;spreadsheet&lt;/a&gt; feature, I can even do some basic analysis of it. I use a &lt;a href=&quot;https://orgmode.org/manual/Capture-templates.html&quot;&gt;fancy capture template&lt;/a&gt; to collect the information about each book and store it at the end of the log without having to open the log file, navigate to the end of the table, and tab through the various fields.&lt;/p&gt;&lt;p&gt;Since it's also possible to post to the &lt;a href=&quot;https://joinmastodon.org/&quot;&gt;Mastodon microblogging platform&lt;/a&gt; from Emacs thanks to the &lt;a href=&quot;https://codeberg.org/martianh/mastodon.el&quot;&gt;mastodon.el library&lt;/a&gt;, I figured it shouldn't be too difficult to automatically post to mastodon every time I start reading a new book. And guess what! I was right.&lt;/p&gt;&lt;p&gt;This relies on the fact that org-mode provides &quot;hooks&quot; to customize the note capture process. &lt;a href=&quot;https://www.gnu.org/software/emacs/manual/html_node/emacs/Hooks.html&quot;&gt;A hook is a lisp function&lt;/a&gt; that gets run by emacs whenever the corresponding event takes place. In this case, org-mode provides several hooks related to finishing up the note capture process. The normal hooks are &quot;org-capture-before-finalize-hooks&quot;, which run before org-mode cleans up everything related to capturing the note and gives you access to the captured note, and &quot;org-capture-after-finalize-hooks&quot;, which runs at the very end of the note capturing process, which org-mode describes as &quot;Suitable for window cleanup.&quot; These hooks run every time you capture any kind of note, but recent versions of org-mode also provides a way to define &quot;before-finalize&quot; and &quot;after-finalize&quot; hooks that only run when a particular capture template is used, so that's what I did.&lt;/p&gt;&lt;p&gt;At the end of my existing &quot;make a note about starting to read a book&quot; template, I added the extra option&lt;/p&gt;&lt;p style=&quot;margin-left: 40px; text-align: left;&quot;&gt;&lt;span style=&quot;font-family: courier;&quot;&gt;:before-finalize #'djf-org-share-reading&lt;/span&gt;&lt;/p&gt;&lt;p style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-family: inherit;&quot;&gt; which tells org-mode to run the hook &lt;/span&gt;&lt;span style=&quot;font-family: courier;&quot;&gt;djf-org-share-reading&lt;/span&gt;&lt;span style=&quot;font-family: inherit;&quot;&gt; only when the &quot;start reading a book&quot; template is launched, and then I defined the function&lt;/span&gt;&lt;span style=&quot;font-family: courier;&quot;&gt; djf-org-share-reading:&lt;/span&gt;&lt;/p&gt;&lt;p style=&quot;margin-left: 40px; text-align: left;&quot;&gt;&lt;span style=&quot;font-family: courier;&quot;&gt;(defun djf-org-share-reading ()&lt;br /&gt;  (when (require 'mastodon nil t)&lt;br /&gt;    ;; parse out the org-capture data&lt;br /&gt;    (let* ((capture (buffer-substring (pos-bol) (pos-eol)))&lt;br /&gt;           (split-line (split-string capture &quot;|+&quot; t &quot; +&quot;))&lt;br /&gt;           (author (car split-line))&lt;br /&gt;           (title (cadr split-line))&lt;br /&gt;           (language (caddr split-line)))&lt;br /&gt;      (save-excursion&lt;br /&gt;        ;; Create the mastodon toot buffer, insert the captured&lt;br /&gt;        ;; Org information, and then post it.&lt;br /&gt;        (mastodon-toot)&lt;br /&gt;        (insert (format &quot;I just started reading %s by %s%s&quot;&lt;br /&gt;                        title author (if (string-equal language &quot;F&quot;) &quot; in French&quot; &quot;&quot;)))&lt;br /&gt;        (mastodon-toot-send)))))&lt;/span&gt;&lt;/p&gt;&lt;p style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-family: courier;&quot;&gt; First, I make sure that mastodon is available and loaded, then I grab and process the information about the book that I'm reading, then I call mastodon-toot to set up a buffer for creating a post, insert the message that I want to post, and then tell mastodon.el to send the post. If I haven't logged into Mastodon from Emacs yet, the call to &quot;mastodon-toot&quot; will automatically log me in. Mastodon takes care of the posting and getting rid of the buffer holding my draft post, and once djf-org-share-reading returns, org-mode cleans up the buffer it used to save the note, and I'm right back where I started in Emacs before I made the note.&lt;/span&gt;&lt;/p&gt;</content>
  1580.      <author>
  1581.  <name>David J. Fiander (noreply@blogger.com)</name>
  1582. <uri>http://lisletters.fiander.info/</uri>
  1583.      </author>
  1584.    </entry>
  1585.  
  1586.    <entry>
  1587.      <title>Mita Williams: Libraries can provide alternatives to Ai with queryable databases</title>
  1588.      <link rel="alternate" type="text/html" href="https://librarian.aedileworks.com/2025/06/19/libraries-can-provide-alternatives-to-ai-with-queryable-databases/"/>
  1589.      <id>https://librarian.aedileworks.com/?p=1994</id>
  1590.      <updated>2025-06-19T11:19:15+00:00</updated>
  1591.      <content type="html">I cannot ask my library catalogue to give me a list of Canadian Law Reviews and Journals we currently subscribe to but I *can* ask Wikidata which Canadian Law Review titles are open access..</content>
  1592.      <author>
  1593.  <name>Mita Williams</name>
  1594. <uri>https://librarian.aedileworks.com</uri>
  1595.      </author>
  1596.    </entry>
  1597.  
  1598.    <entry>
  1599.      <title>Artefacto: By librarians for librarians 1: the cataloguer’s dilemma</title>
  1600.      <link rel="alternate" type="text/html" href="https://www.artefacto.org.uk/by-librarians-for-librarians-cataloguers/"/>
  1601.      <id>https://www.artefacto.org.uk/?p=1689</id>
  1602.      <updated>2025-06-19T10:30:32+00:00</updated>
  1603.      <content type="html">In this series, we wanted to take a few moments to celebrate some of the amazing tools that have been developed for and by librarians (and other library pros). These often fill a vital gap in workflows or solve a pain point that library staff encounter in their work.  We don’t mean the big tools [...]&lt;p&gt;&lt;a class=&quot;understrap-read-more-link&quot; href=&quot;https://www.artefacto.org.uk/by-librarians-for-librarians-cataloguers/&quot;&gt;Continue Reading...&lt;/a&gt;&lt;/p&gt;
  1604. &lt;p&gt;&lt;a href=&quot;https://www.artefacto.org.uk/by-librarians-for-librarians-cataloguers/&quot; rel=&quot;nofollow&quot;&gt;Source&lt;/a&gt;&lt;/p&gt;</content>
  1605.      <author>
  1606.  <name>Artefacto</name>
  1607. <uri>https://www.artefacto.org.uk</uri>
  1608.      </author>
  1609.    </entry>
  1610.  
  1611.    <entry>
  1612.      <title>Nick Ruest: Misogyny has become a political strategy — here's how the pandemic helped make it happen</title>
  1613.      <link rel="alternate" type="text/html" href="https://ruebot.net/publication/the-conversation-202506/"/>
  1614.      <id>https://ruebot.net/publication/the-conversation-202506/</id>
  1615.      <updated>2025-06-18T00:00:00+00:00</updated>
  1616.      <content type="html"></content>
  1617.      <author>
  1618.    <name>Nick Ruest</name>
  1619. <uri>https://ruebot.net/</uri>
  1620.      </author>
  1621.    </entry>
  1622.  
  1623.    <entry>
  1624.      <title>Web Science and Digital Libraries (WS-DL) Group at Old Dominion University: 2025-06-17: The 6th Research Infrastructures for the Study of Archived Web Materials (RESAW) Conference Trip Report</title>
  1625.      <link rel="alternate" type="text/html" href="https://ws-dl.blogspot.com/2025/06/2025-06-17-6th-research-infrastructures.html"/>
  1626.      <id>tag:blogger.com,1999:blog-953024975153422094.post-2490593107995489811</id>
  1627.      <updated>2025-06-17T19:49:45+00:00</updated>
  1628.      <content type="html">&lt;p&gt; &lt;/p&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhExYoU2Y2ct2ZHq7XFrkfPuydyw85C-2s593etpDVSDpUMuZeNdy8X7D-0Rt4CqZLojeWKZV_7Lpz_o4daF_y5uW5VvPVWN_VXNimABtNOCD7Wg2M5cQ_BrfXK__iY2apDN4lDF8SiKphFUvyJ3MFWVluuC1C23NFT4m8Vq0CzUqmvowhTBf8hMBU_EX94/s952/blog1.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;262&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhExYoU2Y2ct2ZHq7XFrkfPuydyw85C-2s593etpDVSDpUMuZeNdy8X7D-0Rt4CqZLojeWKZV_7Lpz_o4daF_y5uW5VvPVWN_VXNimABtNOCD7Wg2M5cQ_BrfXK__iY2apDN4lDF8SiKphFUvyJ3MFWVluuC1C23NFT4m8Vq0CzUqmvowhTBf8hMBU_EX94/w640-h262/blog1.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p&gt;&lt;/p&gt;&lt;span id=&quot;docs-internal-guid-4f384af8-7fff-5e1d-0a16-92a4787fea10&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;a href=&quot;https://www.uni-siegen.de/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Siegen University&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;&lt;br /&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The &lt;/span&gt;&lt;a href=&quot;https://www.mediacoop.uni-siegen.de/datafiedweb/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;6th RESAW (Research Infrastructures for the Study of Archived Web Materials) conference &lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;took place in Siegen, Germany from June 4 – June 6, 2025. The conference occurs every other year in Europe, and features a mix of presentations along the spectrum from technical to digital humanities, by researchers affiliated with universities as well as from national and commercial web archiving organizations. The conference started off with quite a flurry when Cologne was &lt;/span&gt;&lt;a href=&quot;https://www.reuters.com/world/central-cologne-evacuated-after-discovery-wartime-bombs-2025-06-04/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;evacuated&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; following discovery of WWII bombs, disrupting many travel plans into Siegen.&lt;/span&gt;&lt;/p&gt;&lt;h2 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 16pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Wednesday&lt;/span&gt;&lt;/h2&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Workshop: Mentorship for Early Career Scholars in Web Archive Studies&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The Mentorship for Early Career Scholars in Web Archive Studies workshop started with participants introducing themselves. There were PhD students, postdocs, and mentors. Then, participants wrote down questions they had about web archive studies. The mentors characterized the questions into similar topics such as ethics, publishing, and methods. Then because the group was so large, we split into two groups for discussion. My group included &lt;/span&gt;&lt;a href=&quot;https://figoblog.org/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Emmanuelle Bermès&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://www.rug.nl/staff/s.i.aasman/?lang=en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Susan Aasman&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://github.com/Klindten&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Anders Klindt Myrvoll&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://www.mediacoop.uni-siegen.de/en/members/inga-schuppener/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Inga Schuppener&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of Siegen University, &lt;/span&gt;&lt;a href=&quot;https://fr.linkedin.com/in/marina-hervieu&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Marina Hervieu&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.bnf.fr/en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;BnF&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://www.chartes.psl.eu/en/research/centre-jean-mabillon/research-projects/skybox-skyblogs-revealed&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Skybox blogs project&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://www.sas.ac.uk/people/beatrice-cannelli&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Beatrice Cannelli&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of the &lt;/span&gt;&lt;a href=&quot;https://www.london.ac.uk/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;University of London&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://louisravn.wordpress.com/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Louis Ravn&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.uva.nl/en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;University of Amsterdam&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://github.com/marcusrommel&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Marcus Rommel&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of Siegen University. Some of the topics we discussed include how there are standard guidelines for legal use but not ethical use, how access policies can hinder dissemination, and how to engage with web archive content of underrepresented communities. One of the takeaways of the workshop was that there should be a Zotero list of papers for getting started with web archive research. The organizers of RESAW have attempted this a few times now, and ODU participated in a similar initiative &lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2019/10/2019-10-31-continuing-education-to.html&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;CEDWARC&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; previously as well.&lt;/span&gt;&lt;/p&gt;&lt;h2 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 16pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Thursday&lt;/span&gt;&lt;/h2&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Keynote: Conference Opening Session&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The opening keynote by &lt;/span&gt;&lt;a href=&quot;https://pure.au.dk/portal/en/persons/nb%40cc.au.dk&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Niels Brügger&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; traced the history of the RESAW workshop. He also introduced the demographics of participants, which heavily skews European. There are 95 registered participants from 11 countries, and 40 presentations over the two and a half days of the conference. &lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrfqpRmSnPZ1MvACRGzVp0yrRPGXfkXU0gNl8zpgIlEVLgqiA-M5hIEC_-lwx790LpG3dt7CkOFmmOIttqZdyGs5rUA7dY1ltxLU8Xeavgz0g2hor2kuxZzUwlvHdVrXoImLSXLpeJI58rXz4ElHVPPBVmzpqWzZ2F7X5WAsDtNXBOfwTbQiTV8MF4Yic3/s836/blog2.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;228&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrfqpRmSnPZ1MvACRGzVp0yrRPGXfkXU0gNl8zpgIlEVLgqiA-M5hIEC_-lwx790LpG3dt7CkOFmmOIttqZdyGs5rUA7dY1ltxLU8Xeavgz0g2hor2kuxZzUwlvHdVrXoImLSXLpeJI58rXz4ElHVPPBVmzpqWzZ2F7X5WAsDtNXBOfwTbQiTV8MF4Yic3/w640-h228/blog2.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;RESAW 2025 attendees came from 11 countries.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;br /&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7FiI9ZABwxgb5KrPB6Uhyphenhyphen4tBmgl3VYBJoBIuEfbBdYcOD5NqNy4gXpAbKvugp2HiC6BnsOc6IMyQodMqFBDhxiQsyAmKdwdPAZjhpdr_rTyDWhzIO8Lr8QR_g4jqLxa3-JTRHC4rlk6UYzqzzOJS4IOVshO3A9pYf3A4YOlY_2H72BrrsYKIr_TEfOHnt/s773/blog3.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;468&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7FiI9ZABwxgb5KrPB6Uhyphenhyphen4tBmgl3VYBJoBIuEfbBdYcOD5NqNy4gXpAbKvugp2HiC6BnsOc6IMyQodMqFBDhxiQsyAmKdwdPAZjhpdr_rTyDWhzIO8Lr8QR_g4jqLxa3-JTRHC4rlk6UYzqzzOJS4IOVshO3A9pYf3A4YOlY_2H72BrrsYKIr_TEfOHnt/w640-h468/blog3.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;RESAW 2025 attendees primarily live in Europe.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Next, &lt;/span&gt;&lt;a href=&quot;https://www.uni-siegen.de/phil/medienwissenschaft/personal/lehrende/gerlitz_carolin/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Carolin Gerlitz&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of University of Siegen talked about the host lab, the &lt;/span&gt;&lt;a href=&quot;https://www.mediacoop.uni-siegen.de/en/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Media of Cooperation&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, funded as a &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Collaborative_Research_Centres&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Collaborative Research Centre&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;. She introduced the idea of the datafied web, and observed that “The web is full of things that were not meant to be saved.” She questioned whose view is being preserved. She also stated that the true value of data lies in its ability to be remixed, rather than its amount.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJfGqtWRed_vvTn2nXxkhoqjltVcZcm5YHoUMP6s9izmm-2qZS92WOm9ZWblwzwZ5dLxYRVjLTS9-_65Wg-2uF0KkcBykjV-Y4ajrSmRCQOA4U1MsmJjhdiFTTjtJOp_l5hP96judhn-_vLk4MFNtIYzt3i5Is5ArO8a6diHSDV_tX1xLf1hci8BdkJN9H/s715/blog4.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;370&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJfGqtWRed_vvTn2nXxkhoqjltVcZcm5YHoUMP6s9izmm-2qZS92WOm9ZWblwzwZ5dLxYRVjLTS9-_65Wg-2uF0KkcBykjV-Y4ajrSmRCQOA4U1MsmJjhdiFTTjtJOp_l5hP96judhn-_vLk4MFNtIYzt3i5Is5ArO8a6diHSDV_tX1xLf1hci8BdkJN9H/w640-h370/blog4.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Carolin Gerlitz gave the opening keynote about the datafied web.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Finally, &lt;/span&gt;&lt;a href=&quot;https://fr.linkedin.com/in/mara-bertelsen-05247b17&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Mara Bertelsen&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://telemme.mmsh.fr/?membres=sophie-gebeil&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Sophie Gebeil&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; gave an entertaining skit on the publication of the previous RESAW conference’s proceedings, including props and audience participation. Everyone was entertained by the skit’s portrayal of the challenges of publishing a multinational volume.&lt;/span&gt;&lt;/p&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Roundtable: The Datafied Web&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The opening roundtable on the datafied web started with each of the five panelists - &lt;/span&gt;&lt;a href=&quot;https://www.uni-siegen.de/phil/medienwissenschaft/personal/lehrende/giessmann_sebastian/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Sebastian Gießmann&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://www.tomandmaria.com/Tom/Home&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Thomas Haigh&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, Carolin Gerlitz, &lt;/span&gt;&lt;a href=&quot;https://www.annehelmond.nl/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Anne Helmond&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, and&lt;/span&gt;&lt;a href=&quot;https://www.bareikyte.net/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt; Miglė Bareikytė&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; - introducing their view of the datafied web. Sebastian Giessmann shared his view related to his research on the history of online payments that datafication is a result of capitalism. Next, Thomas Haigh talked about how database management systems are transitioning towards NoSQL to accommodate vector (LLM) and graph (social media) models. He stated that there was never a web that wasn’t datafied, and that it matters how data is stored because web archives capture traces of the web but it’s not the original data.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Carolin Gerlitz followed by asking what is being measured on the web, how, and who is doing the measuring. She talked about &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Web_counter&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;page counters&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; on the web transitioning to social media reactions and further to LLMs learning from our input. Next, Anne Helmond talked about data in the context of blogs and her work with Carolin on social media reactions. Finally, Miglė Bareikytė, who researches war sensing, talked about her work with archiving Telegram chats and channels in the context of the &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Russian_invasion_of_Ukraine&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Russian invasion of Ukraine&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, as 70% of Ukrainians use the software. She stated that an archive that is not used is easily forgotten, and that once users know they are being archived, it changes how they behave.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Questions for the panel included how to study power relations in the context of the datafied web, and what kind of data the panel would like to have from the current web. The panel gave the &lt;/span&gt;&lt;a href=&quot;https://archive.org/details/COVID-19_Apps&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Covid-19 app collection at the Internet Archive&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; as an example of modern archived data.&lt;/span&gt;&lt;/p&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Session: Web Archives Practices&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The “web archives practices” session opened with Vasco Rato of the &lt;/span&gt;&lt;a href=&quot;https://arquivo.pt&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Portuguese Web Archive&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presenting “Bulk access to web-archived data using APIs”. The Portuguese Web Archive has &lt;/span&gt;&lt;a href=&quot;https://arquivo.pt/api&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;four public APIs&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;: full text search, image search, CDX, and Memento. They stated that the archive now holds 1.4 petabytes of information, and approximately half of the requests come from APIs now. Some examples of projects that have been built using the APIs are the &lt;/span&gt;&lt;a href=&quot;https://arxiv.org/abs/2402.12969&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Gloria Portuguese LLM&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, and the &lt;/span&gt;&lt;a href=&quot;https://sobre.arquivo.pt/en/meet-the-winners-of-the-arquivo-pt-award-2023/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;1st place Arquivo.pt 2023 Awards winner&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; project “Viajar no tempo sobre carris” which aligned CDX schedule data with full text news data.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Next, &lt;/span&gt;&lt;a href=&quot;https://www.ugent.be/mict/en/team/researchers/eveline-vlassenroot.htm&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Eveline Vlassenroot&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.ugent.be/en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Ghent University&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented, “Navigating the Datafied Web: User requirements and literacy with web archives.” Belgium does not currently have a web archive, so the &lt;/span&gt;&lt;a href=&quot;https://www.kbr.be/en/projects/belgicaweb/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Belgicaweb&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; is a three year project to develop an archive, with &lt;/span&gt;&lt;a href=&quot;https://www.go-fair.org/fair-principles/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;FAIR principles&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; in mind. She &lt;/span&gt;&lt;a href=&quot;https://bit.ly/surveyBelgicaWeb&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;conducted a user study&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; with mixed methods. She found that users want curated collections with transparent documentation about selection, as well as search interfaces. One of the main differences in this user study compared to past web archive user studies is that users are requesting &lt;/span&gt;&lt;a href=&quot;https://arxiv.org/abs/1803.09010&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;datasheets&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; and APIs. She is working on a web archive user literacy framework and researcher playbook including ethical guidelines for her PhD work.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Finally, &lt;/span&gt;&lt;a href=&quot;https://uk.linkedin.com/in/helena-byrne-01a888104&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Helena Byrne&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of the &lt;/span&gt;&lt;a href=&quot;https://www.bl.uk/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;British Library&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented “Lessons learnt from preparing collections as data: the UK Web Archive experience.” The &lt;/span&gt;&lt;a href=&quot;https://resaw2023.sciencesconf.org/435072/document&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;initial framework&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; was presented at the previous RESAW conference in 2023. Now, there is a &lt;/span&gt;&lt;a href=&quot;https://blogs.bl.uk/webarchive/2024/11/datasheets-for-web-archives-toolkit-is-now-live.html&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;toolkit&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; published, and an online self-paced &lt;/span&gt;&lt;a href=&quot;https://reshaped.sas.ac.uk&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;course&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; is also available through the University of London. The &lt;/span&gt;&lt;a href=&quot;https://blogs.bl.uk/webarchive/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;UK Web Archive&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; (still &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/British_Library_cyberattack&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;offline&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;) has applied the framework to 10 collections, released under a creative commons license.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Session: Social Media and APIs&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;In the social media and APIs session, &lt;/span&gt;&lt;a href=&quot;https://katiemackinnon.xyz/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Katie Mackinnon&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of the &lt;/span&gt;&lt;a href=&quot;https://www.ku.dk/en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;University of Copenhagen&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented, “Robots.txt and A History of Consent for Web Data Capture.” She talked about the development of robots.txt by Martijn Koster, and how it is a gentleman’s agreement to follow it. She traced policies at the Internet Archive around robots.txt, culminating in the &lt;/span&gt;&lt;a href=&quot;https://blog.archive.org/2017/04/17/robots-txt-meant-for-search-engines-dont-work-well-for-web-archives/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;2017 blog post &lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;that they will no longer follow it. She also showed a &lt;/span&gt;&lt;a href=&quot;https://www.robotstxt.org/robots-webcrawler-mbox.txt&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;1995 post on the webcrawler mailing list &lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;asking for a zip file of the entire internet for AI purposes, which the audience found humorous. She also talked about the &lt;/span&gt;&lt;a href=&quot;https://arxiv.org/abs/2407.14933&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;study by Longpre et al.&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; which traced how website managers are using robots.txt to restrict access to their content by crawlers for LLMs.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;a href=&quot;https://www.uni-paderborn.de/person/72684&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Christian Schulz&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.uni-paderborn.de/en/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Paderborn University&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; next presented “On Reciprocity - Algorithmic Interweavings between PageRank and Social Media.” He talked about comparable systems for ranking authority of social media, such as &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Pingback&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;pingbacks&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, and also talked about how users can manipulate their authority ranking on social media, including the “like4like” strategy. The final presenter in the session was &lt;/span&gt;&lt;a href=&quot;http://www.christinaschinzel.de/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Christina Schinzel&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.uni-weimar.de/en/university/start/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Bauhaus University, Weimar&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;. She presented, “APIs. How their role in the history of computing and their software engineering principles shape the modern datafied web.” She traced the appearance of the term API to the paper, “&lt;/span&gt;&lt;a href=&quot;https://dl.acm.org/doi/10.1145/1476589.1476661&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Data structures and techniques for remote computer graphics&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;.” She talked about the history of APIs on the web, and how they have led to interconnected ecosystems.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;h2 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 16pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Friday&lt;/span&gt;&lt;/h2&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Session: Web Archives Practices&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrTXxx1WLjLi2JBQd9tHrZRmTce8F37ZGQxW8Hx64dlOc1aEL4HuDQ67FXG2DJp8tLNjAz13Je5j3p3qqNtQ7zGNKnCy2gYKEJo4p77uQkxrLbzpmrkyuUsSFxncaHrk8pr0dhjU2nrkIrppsK7DHCUFkWYCatlrfppMuX4DUooYvAjPKX0VmHTirHSX1b/s952/blog5.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;480&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrTXxx1WLjLi2JBQd9tHrZRmTce8F37ZGQxW8Hx64dlOc1aEL4HuDQ67FXG2DJp8tLNjAz13Je5j3p3qqNtQ7zGNKnCy2gYKEJo4p77uQkxrLbzpmrkyuUsSFxncaHrk8pr0dhjU2nrkIrppsK7DHCUFkWYCatlrfppMuX4DUooYvAjPKX0VmHTirHSX1b/w640-h480/blog5.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Lesley Frew presenting “Temporally Extending Existing Web Archive Collections for Longitudinal Analysis.” Photo courtesy of Eveline Vlassenroot.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;In the Web Archives Practices session, I presented our work, “Temporally Extending Existing Web Archive Collections for Longitudinal Analysis.” Our pre-print is available on &lt;/span&gt;&lt;a href=&quot;https://arxiv.org/abs/2505.24091&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;arXiv&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;. We extended the &lt;/span&gt;&lt;a href=&quot;https://doi.org/10.1371/journal.pone.0246450&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;EDGI 2016-2020 federal environmental changes dataset&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; back to 2008 to analyze whether the terms being deleted by the first Trump administration were added by the Obama administration. We described our methodology, which used past web crawling, the &lt;/span&gt;&lt;a href=&quot;https://eotarchive.org/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;End of Term&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; 2008 dataset, and the &lt;/span&gt;&lt;a href=&quot;https://github.com/internetarchive/wayback/tree/master/wayback-cdx-server&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Wayback Machine CDX API&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;. We found that 87% of the pages with terms deleted by the first Trump administration contained deleted terms originally added by the Obama administration. &lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Next, &lt;/span&gt;&lt;a href=&quot;https://github.com/aurigandrea&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Andrea Kocsis&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented, “Engaging audiences with the UK Web Archive: Strategies for general readers, data users, and the digitally curious.” She worked with three groups of users on a computational skills spectrum: general readers do not have experience with web archives, data users have heavy experience with computational use of web archives, and in between are the digitally curious, for example users with some Jupyter notebook experience without specific web archive data experience. She also worked on a first of its kind physical web archive exhibit called “Digital Ghosts - Exploring Scotland’s Heritage on the Web.” Andrea’s research can be found on &lt;/span&gt;&lt;a href=&quot;https://github.com/aurigandrea/NLS-Fellowship-2024/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;GitHub&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Finally, Ricardo Basílio presented “Seed lists on themes and events on Arquivo.pt: a curious starting point for discovering a web archive.” He talked about his methodology for obtaining lists of seeds, which includes both automated (Bing) and manual (individual and community) seed list development. He pointed out that automated methods like scraping a seed URL for additional URLs do not result in correct seed lists for very specific collections, such as a list of Portuguese art galleries, since external links wouldn’t be restricted to only Portuguese art galleries, which necessitates manual curation. WSDL has also done previous work on seeds and collections, including &lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2020/09/2020-09-28-my-report-card-to-my-mother.html&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;scraping social media for micro collections by Alexander Nwala&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; and &lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2021/09/2021-09-01-from-student-to-researcher.html&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;summarizing collections by Shawn Jones&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;br /&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGCp5tvM6hrA1pQrc0FQXvcU2BwPBSQ-zLtD7Oqo3C1nTimrcIimMTM-gp4PGk8jdUib9Iv8SQxfSWoFKUNH0Hc4EzrP8JA617M412DKLccYY6uwhFVzfTOHo-YpsSXzhxRTtp8cvT6uPSMIua6qZg4Sy7jhR2ama6TL1DBhO0eIXbxRbGrpsFHj2ZL2T8/s877/blog6.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;430&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGCp5tvM6hrA1pQrc0FQXvcU2BwPBSQ-zLtD7Oqo3C1nTimrcIimMTM-gp4PGk8jdUib9Iv8SQxfSWoFKUNH0Hc4EzrP8JA617M412DKLccYY6uwhFVzfTOHo-YpsSXzhxRTtp8cvT6uPSMIua6qZg4Sy7jhR2ama6TL1DBhO0eIXbxRbGrpsFHj2ZL2T8/w640-h430/blog6.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Andrea Kocsis, Ricardo Basílio, and Lesley Frew at the Web Archives Practices session discussion. Photo courtesy of Eveline Vlassenroot.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The way that the sessions are structured at RESAW is that all presenters give their presentations back-to-back, and then all presenters answer questions together in a panel. Here are some of the questions that the audience had for my presentation.&lt;/span&gt;&lt;/p&gt;&lt;ul style=&quot;margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;&quot;&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Q: How did we calculate the deleted terms (were we restricted to only the EDGI 56 terms)?  A: We used the change-text search interface we developed and presented at &lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2023/07/2023-07-26-acmieee-joint-conference-on.html&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;JCDL 2023&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; (“&lt;/span&gt;&lt;a href=&quot;https://doi.ieeecomputersociety.org/10.1109/JCDL57899.2023.00021&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Making Changes in Webpages Discoverable: A Change-Text Search Interface for Web Archives&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;”), so we calculated all deleted terms, not just those tracked by EDGI.&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;li dir=&quot;ltr&quot; style=&quot;font-family: Arial, sans-serif; font-size: 11pt; vertical-align: baseline; white-space: pre;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Q: Are the changes representative of what is expected under a given presidential administration? A: No, we found that the Trump administration deletes up to ten times the amount of pages compared to other recent presidential administrations (“&lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2025/03/2025-03-27-establishing-baseline-by.html&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Establishing a Baseline by Administration for the Takedown of US Government Webpages using Web Archives&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;” blog post).&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Session: My PhD in 5 Minutes&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;In the My PhD in 5 Minutes session, there were three web archives PhD students who presented their dissertation topics. One of the questions by the chair &lt;/span&gt;&lt;a href=&quot;https://www.sas.ac.uk/people/professor-jane-winters&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Jane Winters&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; was to further summarize their PhD to one sentence of what they wanted people to take away. &lt;/span&gt;&lt;a href=&quot;https://www.rug.nl/staff/n.fridzema/?lang=en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Nathalie Fridzema&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of the &lt;/span&gt;&lt;a href=&quot;https://www.rug.nl/?lang=en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;University of Groningen&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented, “Before WEB 2.0: A Cultural History of Early Web Practices in the Netherlands from 1994 until 2004.” She wants people to take away the rich history of the Dutch web found in web archives. &lt;/span&gt;&lt;a href=&quot;https://www.kleine-formen.de/team/anya-shchetvina/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Anya Shchetvina&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.hu-berlin.de/en&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Humboldt University&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented, “Manifesting The Web: Network Imaginaries in Manifesto Writing Between the 1980s and the 2020s.” She wants people to take away that manifestos have persisted over time but changed. &lt;/span&gt;&lt;a href=&quot;https://europeannewschool.eu/johanna-hiebl&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Johanna Hiebl&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; of &lt;/span&gt;&lt;a href=&quot;https://www.europa-uni.de/en/index.html&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;European University Viadrina&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; presented, “Battlefield of Truth(s) on Investigative Frontlines: From Data Activism to OSINT Professionalism.” She wants people to take away an open source methodology that is ethical for social sciences. There were no other parallel sessions during this time, so all 100 RESAW attendees came to this session and were able to ask questions to the presenters.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;h3 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #434343; font-size: 14pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Roundtable: RESAW - The First 10 Years and What’s Next&lt;/span&gt;&lt;/h3&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;In the RESAW business session roundtable, the panelists (Susan Aasman, Emmanuelle Bermès, &lt;/span&gt;&lt;a href=&quot;https://www.kathrynbrewster.com/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Kat Brewster&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://lab.kb.nl/person/iris-geldermans&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Iris Geldermans&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://www.jameshodges.net/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;James Hodges&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;/span&gt;&lt;a href=&quot;https://www.mediacoop.uni-siegen.de/en/members/seitz-tatjana-m-a/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Tatjana Seitz&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, and &lt;/span&gt;&lt;a href=&quot;https://lab.kb.nl/person/jesper-verhoef&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Jesper Verhoef&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;) engaged with each other and the full conference audience about what is to come. One of the successes they discussed was that there is value in non-US centered conferences in the topics studied. One of the challenges was that because RESAW is currently almost entirely European attendees, they wish to increase geographical diversity to Africa and South Asia. There was also an extensive discussion about having “less” (in the context of what web archives hold), but there were concerns about who decides what is deleted and how, and in a complementary sense who decides what is kept.&lt;/span&gt;&lt;/p&gt;&lt;h2 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 16pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Session: Conference Closing&lt;/span&gt;&lt;/h2&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8PJY538IXJlEkgRBnALVy9qwfq49W-u88UtGFm3-ZmDYZZtr7T5NxB-sLERn7QhD7bvSsn2tBcNJCj-D7UaORTJhfaCOPGgLZTJEevhHTKAg2pH7Ogg8H2Kx6_y44Wf2UeygP2BYmtgKC0U5ugXQyPvLLVLTwWSOlmH4mal3p0iJ9nPoMztGWAXTFDZ7v/s703/blog7.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;408&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8PJY538IXJlEkgRBnALVy9qwfq49W-u88UtGFm3-ZmDYZZtr7T5NxB-sLERn7QhD7bvSsn2tBcNJCj-D7UaORTJhfaCOPGgLZTJEevhHTKAg2pH7Ogg8H2Kx6_y44Wf2UeygP2BYmtgKC0U5ugXQyPvLLVLTwWSOlmH4mal3p0iJ9nPoMztGWAXTFDZ7v/w640-h408/blog7.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The RESAW conference is an informal, international, interdisciplinary web archives conference.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The RESAW conference happens every other year, and the locations are announced four years in advance (so for the next two conferences). RESAW 2027 will be in &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Groningen&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Groningen, Netherlands&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, and RESAW 2029 will be hosted by &lt;/span&gt;&lt;a href=&quot;https://www.uni.lu/c2dh-en/&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;C2DH&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; in &lt;/span&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Esch-sur-Alzette&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Luxembourg&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;.&lt;/span&gt;&lt;/p&gt;&lt;h2 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 16pt; font-weight: 400; vertical-align: baseline;&quot;&gt;Conclusion&lt;/span&gt;&lt;/h2&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;I enjoyed attending RESAW because it brought together 100 interdisciplinary web archive researchers. The interdisciplinary departments represented include digital humanities, media studies, information science, and computer science. While representatives of web archiving organizations were present at the conference, the focus was on individual research rather than products or tools. I liked seeing the different methodologies that non-computational scholars use with web archives for their research. I hope that WSDL can attend this conference regularly in the future.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjstcrsQMwnL015yx9_Dig5K01OwRATilYaqG-xwVo7YJLFicr09s_zurgy3WdvXTaqLEVGaGgJnuqn6OjpGt_KZ4JYOkvFAj3CLmo6iVdjjPLL-U4Y1k7HkTSjOBv1veemnZgt6LEdagJnXtEP8tMdaDvNjbEGPBlPIBZWOowB9vfswC49vfDJjMhRZu9m/s944/blog8.png&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;292&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjstcrsQMwnL015yx9_Dig5K01OwRATilYaqG-xwVo7YJLFicr09s_zurgy3WdvXTaqLEVGaGgJnuqn6OjpGt_KZ4JYOkvFAj3CLmo6iVdjjPLL-U4Y1k7HkTSjOBv1veemnZgt6LEdagJnXtEP8tMdaDvNjbEGPBlPIBZWOowB9vfswC49vfDJjMhRZu9m/w640-h292/blog8.png&quot; width=&quot;640&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;br /&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;River Sieg&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;-Lesley&lt;/span&gt;&lt;/p&gt;&lt;div&gt;&lt;span face=&quot;Arial, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;</content>
  1629.      <author>
  1630.  <name>Lesley Frew (noreply@blogger.com)</name>
  1631. <uri>https://ws-dl.blogspot.com/</uri>
  1632.      </author>
  1633.    </entry>
  1634.  
  1635.    <entry>
  1636.      <title>LibraryThing (Thingology): Come Join the 2025 Pride Treasure Hunt!</title>
  1637.      <link rel="alternate" type="text/html" href="https://blog.librarything.com/2025/06/come-join-the-2025-pride-treasure-hunt/"/>
  1638.      <id>https://blog.librarything.com/?p=11289</id>
  1639.      <updated>2025-06-17T16:47:00+00:00</updated>
  1640.      <content type="html">&lt;div style=&quot;float: right; margin: 0 0 1em 1em; clear: right;&quot;&gt;&lt;a href=&quot;https://www.librarything.com/hunt/37&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://image.librarything.com/pics/hunt/2025-06_512.png&quot; style=&quot;width: 300px;&quot; /&gt;&lt;/a&gt;&lt;/div&gt;
  1641. &lt;p&gt;It’s June, and that means that our annual &lt;a href=&quot;https://www.librarything.com/hunt/37&quot;&gt;Pride Month Treasure Hunt&lt;/a&gt; is back!&lt;/p&gt;
  1642. &lt;p&gt;We’ve scattered a shower of rainbows around the site, and it’s up to you to try and find them all.&lt;/p&gt;
  1643. &lt;ul&gt;
  1644. &lt;li&gt;Decipher the clues and visit the corresponding LibraryThing pages to find a rainbow. Each clue points to a &lt;b&gt;specific page&lt;/b&gt; right here on LibraryThing. Remember, they are not necessarily work pages!&lt;/li&gt;
  1645. &lt;li&gt;If there’s a rainbow on a page, you’ll see a banner at the top of the page.&lt;/li&gt;
  1646. &lt;li&gt;You have just under &lt;b&gt;two weeks&lt;/b&gt; to find all the rainbows (until &lt;b&gt;11:59pm EDT, Monday June 30th&lt;/b&gt;).&lt;/li&gt;
  1647. &lt;li&gt;Come brag about your shower of rainbows (and get hints) on &lt;a href=&quot;https://www.librarything.com/topic/371626&quot;&gt;Talk&lt;/a&gt;.&lt;/li&gt;
  1648. &lt;/ul&gt;
  1649. &lt;p&gt;Win prizes:&lt;/p&gt;
  1650. &lt;ul&gt;
  1651. &lt;li&gt;Any member who finds at least two rainbows will be awarded a rainbow badge. &lt;a href=&quot;https://www.librarything.com/stats/MEMBERNAME/badges&quot;&gt;Badge&lt;/a&gt; (&lt;img height=&quot;16&quot; src=&quot;https://image.librarything.com/pics/hunt/hunt_badge_2024_06_32.png&quot; style=&quot;vertical-align: top; margin: 0 ;&quot; /&gt;).&lt;/li&gt;
  1652. &lt;li&gt;Members who find all 12 rainbows will be entered into a drawing for one of five sets of LibraryThing (or TinyCat) &lt;a href=&quot;https://www.librarything.com/more/store&quot;&gt;swag&lt;/a&gt;. We’ll announce winners at the end of the hunt.&lt;/li&gt;
  1653. &lt;/ul&gt;
  1654. &lt;p&gt;P.S. Thanks to &lt;a href=&quot;https://www.librarything.com/profile/conceptDawg&quot;&gt;conceptDawg&lt;/a&gt; for the scarlet ibis illustration.&lt;/p&gt;
  1655. &lt;p&gt;ConceptDawg has made all of our treasure hunt graphics in the last couple of years. We like them, and hope you do, too!&lt;/p&gt;</content>
  1656.      <author>
  1657.  <name>Abigail Adams</name>
  1658. <uri>https://blog.librarything.com</uri>
  1659.      </author>
  1660.    </entry>
  1661.  
  1662.    <entry>
  1663.      <title>Digital Library Federation: 2025 DLF Forum Program Now Available</title>
  1664.      <link rel="alternate" type="text/html" href="https://www.diglib.org/2025-dlf-forum-program-now-available/"/>
  1665.      <id>https://www.diglib.org/?p=86228</id>
  1666.      <updated>2025-06-17T15:20:06+00:00</updated>
  1667.      <content type="html">&lt;p&gt;&lt;a href=&quot;https://www.clir.org&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;CLIR&lt;/a&gt; is pleased to announce that the program is now available for the in-person &lt;a href=&quot;https://www.diglib.org/&quot;&gt;Digital Library Federation&lt;/a&gt;’s &lt;a href=&quot;https://forum2025.diglib.org&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;DLF Forum&lt;/a&gt; happening in Denver, Colorado November 17-19, 2025.&lt;/p&gt;
  1668. &lt;p&gt;&lt;strong&gt;&lt;a href=&quot;https://dlfforum2025.sched.com/&quot;&gt;The program for the in-person event is available here.&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;
  1669. &lt;p&gt;The DLF Forum welcomes digital library, archives, and museum practitioners from member institutions and beyond—for whom it serves as a meeting place, marketplace, and congress. Here, the DLF community celebrates successes, learns from mistakes, sets grassroots agendas, and organizes for action. The Forum is preceded by Learn@DLF, our pre-conference workshop day on November 16. &lt;a href=&quot;https://forum2025.diglib.org/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Learn more about the events.&lt;/a&gt;&lt;/p&gt;
  1670. &lt;p&gt;&lt;a href=&quot;https://forum2025.diglib.org/registration&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Register for both events today&lt;/a&gt; to take advantage of early bird pricing and best Learn@DLF workshop availability, &lt;a href=&quot;https://forum2025.diglib.org/resources/conference-venue-and-hotel/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;secure your spot in our discounted hotel block&lt;/a&gt; while supply lasts, and plan your trip with our &lt;a href=&quot;https://forum2025.diglib.org/resources/local-guide/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;community-written local guide to Denver&lt;/a&gt;.&lt;/p&gt;
  1671. &lt;p&gt;&lt;a href=&quot;https://share.hsforms.com/1MhcafbpARxGCIS1OQD6rKgc21y3&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Subscribe to our newsletter&lt;/a&gt; to be sure to hear all the Forum news first.&lt;/p&gt;
  1672. &lt;p&gt;The post &lt;a href=&quot;https://www.diglib.org/2025-dlf-forum-program-now-available/&quot; rel=&quot;nofollow&quot;&gt;2025 DLF Forum Program Now Available&lt;/a&gt; appeared first on &lt;a href=&quot;https://www.diglib.org&quot; rel=&quot;nofollow&quot;&gt;DLF&lt;/a&gt;.&lt;/p&gt;</content>
  1673.      <author>
  1674.  <name>Aliya Reich</name>
  1675. <uri>https://www.diglib.org</uri>
  1676.      </author>
  1677.    </entry>
  1678.  
  1679.    <entry>
  1680.      <title>David Rosenthal: The State Of Storage</title>
  1681.      <link rel="alternate" type="text/html" href="https://blog.dshr.org/2025/06/the-state-of-storage.html"/>
  1682.      <id>tag:blogger.com,1999:blog-4503292949532760618.post-5168575934727626275</id>
  1683.      <updated>2025-06-17T15:00:00+00:00</updated>
  1684.      <content type="html">&lt;i&gt;The Register&lt;/i&gt; is running a series on &lt;a href=&quot;https://www.theregister.com/special_features/the_state_of_storage/&quot;&gt;&lt;i&gt;The State Of Storage&lt;/i&gt;&lt;/a&gt;. Below the fold I flag some articles worth reading.&lt;br /&gt;
  1685. &lt;br /&gt;
  1686. &lt;span&gt;&lt;a name=&quot;more&quot;&gt;&lt;/a&gt;&lt;/span&gt;
  1687.  
  1688. Richard Speed's &lt;a href=&quot;https://www.theregister.com/2025/06/10/retro_storage/&quot;&gt;&lt;i&gt;Old but gold: Paper tape and punched cards still getting the job done – just about&lt;/i&gt;&lt;/a&gt; covers the issues hampering efforts to recover data from legacy media:&lt;br /&gt;
  1689. &lt;blockquote&gt;
  1690. A challenge with storage is the longevity of the media and the availability of hardware capable of retrieving it. While paper tape and punch cards are long-lasting and can be read by enterprising enthusiasts of today, modern densely packed SSDs and spinning disks might present more of a challenge for the archivists of tomorrow.
  1691. &lt;/blockquote&gt;
  1692. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://regmedia.co.uk/2025/06/02/project-silica-glass.jpg&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;200&quot; src=&quot;https://regmedia.co.uk/2025/06/02/project-silica-glass.jpg&quot; width=&quot;159&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://regmedia.co.uk/2025/06/02/project-silica-glass.jpg&quot;&gt;Project Silica tablet&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  1693. Chris Mellor's &lt;a href=&quot;https://www.theregister.com/2025/06/12/archival_storage_feature/&quot;&gt;&lt;i&gt;Tape, glass, and molecules – the future of archival storage&lt;/i&gt;&lt;/a&gt; looks at the issues I covered in &lt;a href=&quot;https://blog.dshr.org/2025/03/archival-storage.html&quot;&gt;&lt;i&gt;Archival Storage&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1694. &lt;blockquote&gt;
  1695. The future of archival data storage is tape, more tape, and then possibly glass-based tech, with DNA and other molecular tech still a distant prospect.&lt;br /&gt;
  1696. &lt;br /&gt;
  1697. The function of archival storage is to keep data for the long term – decades and beyond – reliably and affordably. Currently, the main medium for this is LTO tape and it is slow, has a limited life, and not enough capacity considering the implications of ever-increasing image and video resolution and AI-infused data generation. However, there is as yet no viable tape replacement technology at scale, only possibilities, with optical storage more practical and nearer to productization than DNA or other molecular storage.
  1698. &lt;/blockquote&gt;
  1699. Mellor mostly agrees with me, but is more optimistic about Cerabyte's technology than I am. It is less aggressive than Project Silica, writing a single layer of data onto glass tablets. He reports that:&lt;br /&gt;
  1700. &lt;blockquote&gt;
  1701. Cerabyte has attracted investment from In-Q-Tel, Pure Storage, and Western Digital. Shantnu Sharma, WD's Chief Strategy and Corporate Development Officer, said: &quot;We are looking forward to working with Cerabyte to formulate a technology partnership for the commercialization of this technology.&quot;&lt;br /&gt;
  1702. ...&lt;br /&gt;
  1703. Like Project Silica, it is, in our assessment, two to five years away from commercial product availability, but it will then be commercially available.
  1704. &lt;/blockquote&gt;
  1705. Simon Sharwood's &lt;a&gt;&lt;i&gt;The trendline doesn’t look good for hard disk drives&lt;/i&gt;&lt;/a&gt; covers the evolution of the hard disk market to its current focus on nearline storage for hyperscalers:&lt;br /&gt;
  1706. &lt;blockquote&gt;
  1707. In early May, independent digital storage analyst Thomas Coughlin &lt;a href=&quot;https://www.forbes.com/sites/tomcoughlin/2025/05/03/c1q-2025-hdd-industry-update/&quot;&gt;shared&lt;/a&gt; news of falling sales and revenue in the first quarter of 2025, continuing a trend that started in around 2010. Coughlin &lt;a href=&quot;https://www.forbes.com/sites/tomcoughlin/2024/12/06/digital-storage-and-memory-projections-for-2025-part-1/&quot;&gt;cites&lt;/a&gt; data from that year showing around 600 million annual hard disk shipments.&lt;br /&gt;
  1708. &lt;br /&gt;
  1709. In 2025 he thinks around 150 million units will make it out of factory doors.
  1710. &lt;br /&gt;
  1711. Hyperscale datacenter operators will buy most of them and have become HDD manufacturers' largest customers.&lt;br /&gt;
  1712. ...&lt;br /&gt;
  1713. While HDD volumes fall, collective annual shipped HDD capacity is rising – so even though fewer machines are made, they collectively contain more gigabytes than last year's larger disk fleet.
  1714. &lt;/blockquote&gt;
  1715. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://regmedia.co.uk/2025/05/27/hamr-graphic.jpg&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;172&quot; src=&quot;https://regmedia.co.uk/2025/05/27/hamr-graphic.jpg&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://regmedia.co.uk/2025/05/27/hamr-graphic.jpg&quot;&gt;HAMR&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  1716. Chris Mellor's &lt;a href=&quot;https://www.theregister.com/2025/06/09/hamr_100_tb_drive_feature/&quot;&gt;&lt;i&gt;Seagate still HAMRing away at the 100 TB disk drive decades later&lt;/i&gt;&lt;/a&gt; recounts the saga of Seagate's &lt;a href=&quot;https://web.archive.org/web/20190813175046/https://www.seagate.com/news/news-archive/seagate-swings-increase-disc-drive-densities-master-pr/&quot;&gt;two-and-a-half decade long&lt;/a&gt; effort to get Heat Assisted Magnetic Recording disks into mass production:&lt;br /&gt;
  1717. &lt;blockquote&gt;
  1718. Seagate is currently shipping 30 TB conventionally recorded Exos M HAMR drives and both 32 and 36 TB shingled magnetic recording (SMR) drives. These have blocks of partially overlapping write tracks to increase track density and thus overall capacity, but with a slower rewrite speed as entire blocks of tracks have to be rewritten.&lt;br /&gt;
  1719. &lt;br /&gt;
  1720. The company should have a clear drive capacity advantage, but it does not because its HAMR technology has proven extraordinarily difficult to bring into mass production and it has one less platter than competing drives. By 2012, Seagate predicted it could make 1 Tb/in² in HAMR drives after a successful &lt;a href=&quot;https://www.theregister.com/2012/03/20/seagate_terabit_areal_density/&quot;&gt;demo&lt;/a&gt;. That did not happen but it managed to make 16 TB HAMR drives and was sampling them in 2019. They did not make it to mass production either, and a 20 TB technology was the next development. It too did not last long.&lt;br /&gt;
  1721. &lt;br /&gt;
  1722. CEO Dave Mosley told &lt;a href=&quot;https://blocksandfiles.com/2020/11/20/seagate-ceo-talks-about-big-jumps/&quot;&gt;investment analysts in 2020&lt;/a&gt; that Seagate was stepping up to 24 TB HAMR drives. By 2023, a 30 TB HAMR drive was said to be coming. This was actually unveiled in &lt;a href=&quot;https://blocksandfiles.com/2024/01/17/seagate-hamr-drives/&quot;&gt;January 2024&lt;/a&gt;, with the tech branded Mozaic 3+, referring to the 32 and then 36 TB SMR variants.
  1723. &lt;/blockquote&gt;
  1724. Mellor notes the impact this history has had on Seagate's credibilty:&lt;br /&gt;
  1725. &lt;blockquote&gt;
  1726. Seagate has previously made several predictions about HAMR drive availability that were not ultimately achieved. The tech giant appears to have persistently underestimated how long it will take to productize HAMR technology and so reap the production cost benefit of having fewer platters and sliders at any capacity point than its two competitors.
  1727. &lt;/blockquote&gt;</content>
  1728.      <author>
  1729.  <name>David. (noreply@blogger.com)</name>
  1730. <uri>https://blog.dshr.org/</uri>
  1731.      </author>
  1732.    </entry>
  1733.  
  1734.    <entry>
  1735.      <title>Distant Reader Blog: Computing for the definition of intelligence</title>
  1736.      <link rel="alternate" type="text/html" href="https://distantreader.org/blog/computing-for-intelligence/"/>
  1737.      <id>https://distantreader.org/blog/computing-for-intelligence/</id>
  1738.      <updated>2025-06-17T04:00:00+00:00</updated>
  1739.      <content type="html">&lt;div&gt;
  1740.  &lt;p&gt;For a good time, I thought would use distant reading computing techniques to define the word &quot;intelligence&quot;, specifically, &quot;artificial intelligence&quot;.&lt;/p&gt;
  1741.  &lt;p&gt;I began by searching arxiv for pre-prints contgaining the phrase &quot;artificial intelligence&quot;. This resulted in approximately 15,000 records. Using the records' abstracts as proxies for the articles themselves, I created a Distant Reader study carrel (&quot;data set&quot;) from the citations. To give you an idea of the carrel's size and scope, the data set is 2.6 million words long (where the Moby Dick has about 0.25 million words long), and a word cloud illustrating the data set's keywords follows:&lt;/p&gt;
  1742.  &lt;p style=&quot;text-align: center;&quot;&gt;
  1743.    &lt;img alt=&quot;./keywords-cloud-small.png&quot; src=&quot;https://distantreader.org/blog/computing-for-intelligence/keywords-cloud-small.png&quot; /&gt;
  1744.    &lt;br /&gt;
  1745.    &lt;a href=&quot;https://distantreader.org/blog/computing-for-intelligence/keywords-cloud.png&quot;&gt;Wordcloud illustrating the study carrel's scope&lt;/a&gt;
  1746.  &lt;/p&gt;
  1747.  &lt;p&gt;I then extracted all of the sentences from all of the abstracts. I vectorized (&quot;indexed&quot;) the sentences, and stored the result in a database. I then searched the index for the word &quot;intelligence&quot; and specified the index to return the single closest sentence from the result. Apparently the closest sentence comes from an article called &quot;The relationship between Biological and Artificial Intelligence&quot; by George Cevora (2019 - &lt;a href=&quot;https://arxiv.org/abs/1905.00547&quot;&gt;https://arxiv.org/abs/1905.00547&lt;/a&gt;):&lt;/p&gt;
  1748.  &lt;blockquote&gt;
  1749.    &lt;p&gt;Intelligence can be defined as a predominantly human ability to accomplish tasks that are generally hard for computers and animals.&lt;/p&gt;
  1750.  &lt;/blockquote&gt;
  1751.  &lt;p&gt;Not a great definition because it defines things in the negative, but a good start, if you ask me.&lt;/p&gt;
  1752.  &lt;p&gt;In order to garner a more well-rounded definition, I expanded my query to return 20 sentences, and I got the following result:&lt;/p&gt;
  1753.  &lt;blockquote&gt;
  1754.    &lt;p&gt;This paper is a survey of a large number of informal definitions of ``intelligence'' that the authors have collected over the years. Finally, we survey the many other tests and definitions of intelligence that have been proposed for machines. Independent from the still ongoing research in measuring individual intelligence, we anticipate and provide a framework for measuring collective intelligence. Our understanding of intelligence is directed primarily at the human level. In this regard, intelligence becomes a product of information processing. In such a case, intelligence arises as a result of information processing. The discussion at hand is about the elements that constitute the canonical idea of intelligence. We define intelligence as the ability to acquire skills that narrow this search, diversify it and help steer it to more promising areas. To accomplish this, we focus on three basic aspects of intelligence. Intelligence today is considered as a fundamental property of each and every living being. Intelligence can be defined as a predominantly human ability to accomplish tasks that are generally hard for computers and animals. Artificial Intelligence An important aspect of intelligence is the ability to adapt to a novel task without any direct experience (zero-shot), based on its relationship to previous tasks. Intelligence is not a single weighable quantity but a subject that spans Biology, Physics, Philosophy, Cognitive Science, Neuroscience, Psychology, and Computer Science. (Artificial Intelligence) Is understanding a necessary and sufficient condition for intelligence? The disadvantages of this type of intelligence are discussed. Artificial Intelligence Nonetheless, most of these systems lack many capabilities that we would naturally consider to be included in a notion of &quot;intelligence&quot;. Human intelligence has the remarkable ability to assemble basic skills into complex ones so as to solve complex tasks.&lt;/p&gt;
  1755.  &lt;/blockquote&gt;
  1756.  &lt;p&gt;Such is more difficult to consume, but if I measure the cosine distances between each sentence and its sucessor, I can tranform the result into pseudo-paragraphs, which is (usually) easier to consume:&lt;/p&gt;
  1757.  &lt;blockquote&gt;
  1758.    &lt;p&gt;
  1759.      &lt;/p&gt;&lt;p&gt;This paper is a survey of a large number of informal definitions of ``intelligence'' that the authors have collected over the years.&lt;/p&gt;
  1760.      &lt;p&gt;Finally, we survey the many other tests and definitions of intelligence that have been proposed for machines. Independent from the still ongoing research in measuring individual intelligence, we anticipate and provide a framework for measuring collective intelligence. Our understanding of intelligence is directed primarily at the human level. In this regard, intelligence becomes a product of information processing. In such a case, intelligence arises as a result of information processing. The discussion at hand is about the elements that constitute the canonical idea of intelligence. We define intelligence as the ability to acquire skills that narrow this search, diversify it and help steer it to more promising areas. To accomplish this, we focus on three basic aspects of intelligence. Intelligence today is considered as a fundamental property of each and every living being. Intelligence can be defined as a predominantly human ability to accomplish tasks that are generally hard for computers and animals. Artificial Intelligence An important aspect of intelligence is the ability to adapt to a novel task without any direct experience (zero-shot), based on its relationship to previous tasks. Intelligence is not a single weighable quantity but a subject that spans Biology, Physics, Philosophy, Cognitive Science, Neuroscience, Psychology, and Computer Science. (Artificial Intelligence) Is understanding a necessary and sufficient condition for intelligence? The disadvantages of this type of intelligence are discussed. Artificial Intelligence Nonetheless, most of these systems lack many capabilities that we would naturally consider to be included in a notion of &quot;intelligence&quot;. Human intelligence has the remarkable ability to assemble basic skills into complex ones so as to solve complex tasks.&lt;/p&gt;
  1761.    &lt;p&gt;&lt;/p&gt;
  1762.  &lt;/blockquote&gt;
  1763.  &lt;p&gt;This particular example is not extraordinarily helpful because only two &quot;paragraphs&quot; are returned. This tells me each sentence is very similar to its predecessor.&lt;/p&gt;
  1764.  &lt;p&gt;To distill the original results another way, I can submit it to an LLM and tell the model to summarize:&lt;/p&gt;
  1765.  &lt;blockquote&gt;
  1766.    &lt;p&gt;The paper surveys various informal definitions of intelligence and proposes a framework for measuring collective intelligence independent of individual intelligence research. The authors define intelligence as the ability to acquire skills that narrow, diversify, and steer search towards more promising areas. They focus on three aspects of intelligence: (1) adapting to novel tasks without direct experience (zero-shot), (2) assembling basic skills into complex ones to solve complex tasks, and (3) understanding. The paper also discusses the limitations of artificial intelligence systems in terms of their lack of certain capabilities that are inherent in human intelligence. Finally, the authors argue that intelligence is a multidisciplinary subject that spans biology, physics, philosophy, cognitive science, neuroscience, psychology, and computer science.&lt;/p&gt;
  1767.  &lt;/blockquote&gt;
  1768.  &lt;p&gt;Again, not bad, especially if you take the time to close read the pseudo-paragraphs.&lt;/p&gt;
  1769.  &lt;p&gt;Finally, I submitted the original results to another LLM, but this time I posed a question, &quot;What is intelligence?&quot;, and what follows is the result:&lt;/p&gt;
  1770.  &lt;blockquote&gt;
  1771.    &lt;p&gt;
  1772.      &lt;/p&gt;&lt;p&gt;Intelligence is a multifaceted concept that has been studied and debated by various disciplines, including biology, psychology, neuroscience, philosophy, and computer science. At its core, intelligence can be defined as the ability to acquire and apply knowledge, skills, and adaptability in various tasks and situations. This includes the ability to process information, learn from experience, and solve complex problems.&lt;/p&gt;
  1773.      &lt;p&gt;The canonical idea of intelligence suggests that it is a fundamental property of living beings, including humans, animals, and potentially even artificial intelligence. However, there are also different types of intelligence, such as zero-shot learning, which allows AI systems to adapt to novel tasks without any direct experience.&lt;/p&gt;
  1774.      &lt;p&gt;The discussion around intelligence often focuses on the elements that constitute it, such as the ability to learn, reason, and adapt. While understanding is considered a necessary condition for intelligence, it may not be sufficient on its own. Other factors, such as creativity, problem-solving skills, and emotional intelligence, are also important aspects of human intelligence.&lt;/p&gt;
  1775.      &lt;p&gt;Despite the progress made in artificial intelligence, most systems lack many capabilities that are considered essential for true intelligence, such as common sense, creativity, and emotional intelligence. Therefore, while AI can perform certain tasks with great efficiency, it may not be considered truly intelligent in the way that humans are.&lt;/p&gt;
  1776.      &lt;p&gt;In summary, intelligence is a complex and multifaceted concept that encompasses various aspects of information processing, learning, and adaptability. While AI systems have made significant progress in mimicking human-like intelligence, there are still many capabilities that are unique to humans and may not be easily replicable by machines.&lt;/p&gt;
  1777.    &lt;p&gt;&lt;/p&gt;
  1778.  &lt;/blockquote&gt;
  1779.  &lt;p&gt;So? What's the point? We are all suffering from information overload, and we use various tools to help us overcome this problem. Some of those tools include: tables of contents, back-of-the-book indexes, page numbers, running chapter headings, eye glasses, pencils and pens, bibliographic management applications, etc. Why not use the computing features of a computer?&lt;/p&gt;
  1780.  &lt;p&gt;Important! The results of this investigation do not return truth. Instead measurements and observations are returned which are expected to be followed up with close reading. To this end, all of the sentences returned from the original search results can be pointed to in their original documents.&lt;/p&gt;
  1781.  &lt;p&gt;Similarly, the results are intended to be jumping off points for dicussion. For example, if I were to ask you &quot;What is intelligence?&quot;, then you would say a few things, and somebody else would say a few more things, but this process affords you a larger vocabulary for discussion. You can use the results to &quot;riff&quot; off the original question.&lt;/p&gt;
  1782.  &lt;p&gt;So, what is intelligence? Read the results above and discuss.&lt;/p&gt;
  1783.  &lt;p&gt;For extra credit I could apply the same process to works of epistemology, and then compare and contrast. For example, what do Plato, Aquinas, Locke, Hegel, Kant, etc. have to say on topic of intelligence. Then ask yourself, &quot;If I were to use traditional reading technques to accomplish this task, then how long would it take me?&quot;&lt;/p&gt;
  1784. &lt;/div&gt;</content>
  1785.      <author>
  1786.    <name>Distant Reader Blog</name>
  1787. <uri>https://distantreader.org/blog/</uri>
  1788.      </author>
  1789.    </entry>
  1790.  
  1791.    <entry>
  1792.      <title>Distant Reader Blog: Aversions to Artificial Intelligence (AI)</title>
  1793.      <link rel="alternate" type="text/html" href="https://distantreader.org/blog/aversions-to-ai/"/>
  1794.      <id>https://distantreader.org/blog/aversions-to-ai/</id>
  1795.      <updated>2025-06-17T04:00:00+00:00</updated>
  1796.      <content type="html">&lt;div&gt;
  1797.  &lt;p&gt;If LLMs were not created with copyrighted materials, or, if LLMs were created with copyrighted materials, but access to the LLMs was limited only to authorized persons, then do you think the library community would be more amenable to their use?&lt;/p&gt;
  1798.  &lt;p&gt;Generative-AI kinda sorta works like this:&lt;/p&gt;
  1799.  &lt;ol&gt;
  1800.    &lt;li&gt;amass a set of documents&lt;/li&gt;
  1801.    &lt;li&gt;parse the documents into sentence-like chunks&lt;/li&gt;
  1802.    &lt;li&gt;create a document/term matrix (&quot;vectorize&quot;) of all the sentences&lt;/li&gt;
  1803.    &lt;li&gt;save the sentence/vector combinations in a database&lt;/li&gt;
  1804.    &lt;li&gt;garner a query&lt;/li&gt;
  1805.    &lt;li&gt;vectorize the query&lt;/li&gt;
  1806.    &lt;li&gt;compute the similariaties (&quot;distances&quot;) between the vectorized query and each of the stored vectors&lt;/li&gt;
  1807.    &lt;li&gt;return the N most similar sentences&lt;/li&gt;
  1808.    &lt;li&gt;done&lt;/li&gt;
  1809.  &lt;/ol&gt;
  1810.  &lt;p&gt;The problem the library community seems to have is with Step #3, which is often called &quot;embedding&quot; and is done by things called &quot;embedders&quot;. These embedders are representations of their original documents. They, in and of themselves, are sets of vectors describing the distances between documents. (Think &quot;Markov models&quot;, only on steroids.) In order to create these embedders, both a whole lot of content and computing horsepower are necessary; a huge amount of linear algebra needs to be computed agaist a huge amount of sentences in order identify useful distance measures. Many, if not most, of these embedders were created from content harvested from the Web. Alas?&lt;/p&gt;
  1811.  &lt;p&gt;Now suppose the library community were to create one or more of their own embedders? Suppose the content came from things like Project Gutenberg, the HathiTrust, open access journal articles, public domain government documentss, etc? Suppose the process of vectorizing -- the creation a document/term matrixes -- was done with something as simple as
  1812.  &lt;a href=&quot;https://scikit-learn.org/stable/modules/generated/sklearn.feature_extraction.text.CountVectorizer.html&quot;&gt;Scikit Learn's CountVectorizer&lt;/a&gt;. The generative-AI results may not be as good as the results generated by Big Tech, but at least the facet of copyright would have been eliminated.&lt;/p&gt;
  1813.  &lt;p&gt;Suppose the copyright facet were removed from the LLM equation. Do you think the the library community would be as averse to the use of generative-AI? If not, then I assert the aversion is not necessarily copyright but something else, and if so, then what? Put another way, besides copyright, what aversions to the use of generatie-AI does the library community seem to have? Environmentaly unfriendly? Too easy? Requires additional information literacy skills? Professionally threatening? Fear of change? Minimal understanding of how generative-AI works?&lt;/p&gt;
  1814.  &lt;p&gt;Don't get me wrong. I have not been drinking any Kool-Aid, and that said, I don't understand what the problem is. To me the use of LLMs is merely another tool in my library toolbox.&lt;/p&gt;
  1815. &lt;/div&gt;</content>
  1816.      <author>
  1817.    <name>Distant Reader Blog</name>
  1818. <uri>https://distantreader.org/blog/</uri>
  1819.      </author>
  1820.    </entry>
  1821.  
  1822.    <entry>
  1823.      <title>Tara Robertson: Fireside chat with Snotty Nose Rez Kids</title>
  1824.      <link rel="alternate" type="text/html" href="https://tararobertson.ca/2025/fireside-chat-with-snotty-nose-rez-kids/"/>
  1825.      <id>https://tararobertson.ca/?p=6961</id>
  1826.      <updated>2025-06-16T14:14:44+00:00</updated>
  1827.      <content type="html">&lt;p&gt;&lt;a href=&quot;https://tararobertson.ca/wp-content/uploads/2025/06/Tara-Robertson-3-scaled.jpg&quot;&gt;&lt;img alt=&quot;me on the Board of Vancouver stage with Snotty Nose Rez Kids&quot; class=&quot;aligncenter size-large wp-image-6962&quot; height=&quot;800&quot; src=&quot;https://tararobertson.ca/wp-content/uploads/2025/06/Tara-Robertson-3-1024x800.jpg&quot; width=&quot;1024&quot; /&gt;&lt;/a&gt;&lt;/p&gt;
  1828. &lt;p&gt;The &lt;a href=&quot;https://www.boardoftrade.com/&quot;&gt;Greater Vancouver Board of Trade&lt;/a&gt; invited me to do a fireside chat with&lt;a href=&quot;https://snottynoserezkids.com/&quot;&gt; Snotty Nose Rez Kids&lt;/a&gt; for their EDI Forum. Snotty Nose Rez Kids are an Indigenous hip hop duo who have won so many awards. Darren Metz and Quinton Nyce were so real and generous. Our conversation hit a lot of topics including Indigenous futures, matriarchs, what they mean about being a good ancestor and how they work together. They’re incredible artists and leaders.&lt;/p&gt;
  1829. &lt;p&gt;They model being unapologetic and bold in a way that creates more space for all of us to take up space being more of ourselves. &lt;/p&gt;
  1830. &lt;p&gt;Two of the things that they said stuck with me: First, “We’re not just making records, we’re making history”. Second, “It’s not about how far we’ve come, it’s about how far we can go.”&lt;/p&gt;
  1831. &lt;p&gt;Snotty Nose Rez Kids are on tour across North America. Go and see them perform! &lt;/p&gt;
  1832. &lt;p&gt;The post &lt;a href=&quot;https://tararobertson.ca/2025/fireside-chat-with-snotty-nose-rez-kids/&quot;&gt;Fireside chat with Snotty Nose Rez Kids&lt;/a&gt; appeared first on &lt;a href=&quot;https://tararobertson.ca&quot;&gt;Tara Robertson Consulting&lt;/a&gt;.&lt;/p&gt;</content>
  1833.      <author>
  1834.  <name>Tara Robertson</name>
  1835. <uri>https://tararobertson.ca/</uri>
  1836.      </author>
  1837.    </entry>
  1838.  
  1839.    <entry>
  1840.      <title>David Rosenthal: Going Out With A Bang</title>
  1841.      <link rel="alternate" type="text/html" href="https://blog.dshr.org/2025/04/going-out-with-bang.html"/>
  1842.      <id>tag:blogger.com,1999:blog-4503292949532760618.post-5718007253758567340</id>
  1843.      <updated>2025-06-16T14:08:30+00:00</updated>
  1844.      <content type="html">In &lt;a href=&quot;https://blog.dshr.org/2024/10/15c-here-we-come.html&quot;&gt;&lt;i&gt;1.5C Here We Come&lt;/i&gt;&lt;/a&gt; I criticized people like &lt;a href=&quot;https://www.nytimes.com/2024/10/22/climate/ai-big-tech-emissions.html&quot;&gt;Eric Schmidt&lt;/a&gt; who said that:&lt;br /&gt;
  1845. &lt;blockquote&gt;
  1846. the artificial intelligence boom was too powerful, and had too much potential, to let concerns about climate change get in the way.&lt;br /&gt;
  1847. &lt;br /&gt;
  1848. Schmidt, somewhat fatalistically, &lt;a href=&quot;https://disconnect.blog/silicon-valley-is-sacrificing-the-climate-for-ai/&quot;&gt;said that&lt;/a&gt; “we’re not going to hit the climate goals anyway,”
  1849. &lt;/blockquote&gt;
  1850. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8B-JY0hKk1-6EFCHoZ4woTd_u3dqYKQJbgLzPbhmju1GrB3B4CY2HtLMprsuox7djAeXnHTeVRp_mR0G5j6yHUJq1SclVBLeMVyhqZDHEo6NY1Ly63u3hSB8kpR2gBYR2cNLi1xuN3EX6NmTvoTQBDtGTV1rx2rMmFScHkVd9-nYI8V3sxQqrDZSbxmcZ/s979/Salom%C3%A9_Balthus_(cropped).jpg&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;200&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8B-JY0hKk1-6EFCHoZ4woTd_u3dqYKQJbgLzPbhmju1GrB3B4CY2HtLMprsuox7djAeXnHTeVRp_mR0G5j6yHUJq1SclVBLeMVyhqZDHEo6NY1Ly63u3hSB8kpR2gBYR2cNLi1xuN3EX6NmTvoTQBDtGTV1rx2rMmFScHkVd9-nYI8V3sxQqrDZSbxmcZ/w150-h200/Salom%C3%A9_Balthus_(cropped).jpg&quot; width=&quot;150&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;small&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Hanna_Lakomy&quot;&gt;Salomé Balthus&lt;/a&gt;&lt;br /&gt;&lt;a href=&quot;https://commons.wikimedia.org/w/index.php?curid=119928019&quot;&gt;Uwe Hauth, CC BY-SA 4.0&lt;/a&gt;&lt;/small&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  1851. In January for a &lt;i&gt;Daily Mail&lt;/i&gt; article, Miriam Kuepper interviewed &lt;a href=&quot;https://en.wikipedia.org/wiki/Hanna_Lakomy&quot;&gt;Salomé Balthus&lt;/a&gt; a &quot;high-end escort and author from Berlin&quot; who works the &lt;a href=&quot;https://en.wikipedia.org/wiki/World_Economic_Forum&quot;&gt;World Economic Forum&lt;/a&gt;. Balthus reported attitudes that clarify why &quot;3C Here We Come&quot; is more likely. The &lt;a href=&quot;https://www.dailymail.co.uk/news/article-14314191/global-elite-Davos-high-class-escort-spills-beans.html&quot;&gt;article's full title&lt;/a&gt; is:&lt;br /&gt;
  1852. &lt;blockquote&gt;
  1853. What the global elite reveal to Davos sex workers: High-class escort spills the beans on what happens behind closed doors - and how wealthy 'know the world is doomed, so may as well go out with a bang'
  1854. &lt;/blockquote&gt;
  1855. Below the fold I look into a wide range of evidence that Balthus' clients were telling her the truth.&lt;br /&gt;
  1856. &lt;span&gt;&lt;a name=&quot;more&quot;&gt;&lt;/a&gt;&lt;/span&gt;
  1857. &lt;br /&gt;
  1858. Kuepper &lt;a href=&quot;https://www.dailymail.co.uk/news/article-14314191/global-elite-Davos-high-class-escort-spills-beans.html&quot;&gt;quotes Balthus&lt;/a&gt;:&lt;br /&gt;
  1859. &lt;blockquote&gt;
  1860. 'The elephant in the room is climate change. Everyone knows it can't be prevented any more,' she said, adding that the 'super rich' could generally be split into two groups on the topic.&lt;br /&gt;
  1861. &lt;br /&gt;
  1862. 'The one group thinks it only affects the poor, the &quot;not-white race&quot;, while the others fear that it could get worse but there's no sense in trying to do anything about it so they just enjoy themselves,' she told MailOnline.&lt;br /&gt;
  1863. &lt;br /&gt;
  1864. 'The one half is in despair and the other, dumber, half is celebrating future mass deaths. &lt;br /&gt;
  1865. &lt;br /&gt;
  1866. Salome elaborated that some of the uber wealthy people fitting into the first group were saying that those in third world countries 'might all die but us in the North, we're fine'.&lt;br /&gt;
  1867. &lt;br /&gt;
  1868. She said: 'They say that in a democracy you have to sell it, to lie to people and tell them &quot;we didn't know better and didn't think it would get this bad&quot;, not admitting that they know.&lt;br /&gt;
  1869. &lt;br /&gt;
  1870. 'Then there's the other group that thinks it might not be so easy, maybe it will also affect us due to unforeseeable chain reactions.&lt;br /&gt;
  1871. &lt;br /&gt;
  1872. 'But they say they can't do anything against the others so they live following the mantra &quot;after us, the deluge&quot;. &lt;br /&gt;
  1873. &lt;br /&gt;
  1874. 'They say they will enjoy a few more nice years on earth and know that there's no future. They are very cynical and somehow deeply sad.'
  1875. &lt;/blockquote&gt;
  1876. This attitude matches Schmidt's fatalism — we're doomed but we might as well make money/have fun until then. What it misses is that everything they're doing is bringing &quot;until then&quot; closer. As I wrote &lt;a href=&quot;https://blog.dshr.org/2024/10/15c-here-we-come.html&quot;&gt;about Schmidt&lt;/a&gt;:&lt;br /&gt;
  1877. &lt;blockquote&gt;
  1878. He is right that “we’re not going to hit the climate goals anyway&quot;, but that is partly his fault. Even assuming that he's right and AI is capable of magically &quot;solving the problem&quot;, the magic solution &lt;i&gt;won't be in place until long after 2027&lt;/i&gt;, which is when at the current rate we will pass 1.5C. And everything that the tech giants are doing &lt;i&gt;right now&lt;/i&gt; is moving the 1.5C date closer.
  1879. &lt;/blockquote&gt;
  1880. After all, how bad could it be? The economists' models predict a fairly small decrease in global GDP, but Graham Readfearn reports that &lt;a href=&quot;https://www.theguardian.com/environment/2025/apr/01/average-person-will-be-40-poorer-if-world-warms-by-4c-new-research-shows&quot;&gt;&lt;i&gt;Average person will be 40% poorer if world warms by 4C, new research shows&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1881. &lt;blockquote&gt;
  1882. Economic models have systematically underestimated how global heating will affect people’s wealth, according to a new study that finds 4C warming will make the average person 40% poorer – an almost four-fold increase on some estimates.&lt;br /&gt;
  1883. &lt;/blockquote&gt;
  1884. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLao1xWyiYebYLR-Z8XA4Ly-hDUItA-Nzfv8hUF75yJiBxM0hiEmyJzsbpu8XWm0cp-0gxq9SYlXKre3xC9jaSaE_u1cN66_AoFBL88NiWZg6x_QJ4-7GpJQjSLuaSxnfRf24Sda-rhQLyFi1n8kz5RcfDU8RlPhc17xobsO6hEiH7vwTulnQrrI1Nny0h/s454/FIg-1.jpg&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;200&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLao1xWyiYebYLR-Z8XA4Ly-hDUItA-Nzfv8hUF75yJiBxM0hiEmyJzsbpu8XWm0cp-0gxq9SYlXKre3xC9jaSaE_u1cN66_AoFBL88NiWZg6x_QJ4-7GpJQjSLuaSxnfRf24Sda-rhQLyFi1n8kz5RcfDU8RlPhc17xobsO6hEiH7vwTulnQrrI1Nny0h/w165-h200/FIg-1.jpg&quot; width=&quot;165&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://doi.org/10.1088/1748-9326/adbd58&quot;&gt;Neal &lt;i&gt;et al&lt;/i&gt; Fig 1&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  1885. The reason for the underestimation is that their model of the effect of climate change on a country's GDP accounts only for in-country effects. &lt;a href=&quot;https://doi.org/10.1088/1748-9326/adbd58&quot;&gt;&lt;i&gt;Reconsidering the macroeconomic damage of severe warming&lt;/i&gt;&lt;/a&gt; by Timothy Neal &lt;i&gt;et al&lt;/i&gt; from the University of New South Wales instead models global weather's effects on a world of interconnected supply chains:&lt;br /&gt;
  1886. &lt;blockquote&gt;
  1887. Figure 1 shows the projected percentage reduction in global GDP from a high emissions future (SSP5-8.5) relative to a lower emissions future (SSP1-2.6), for the three models outlined in section 2.2. Each economic model is run with and without global weather to determine its impact on the projections. Without the inclusion of global weather (blue line), all three models project more mild economic losses with a median loss at 2100 of −28% for the Burke15 model, −4% for the Kahn21 model, and −11% for the Kotz24 model. Projected losses from the latter two models are more optimistic than in the original articles, likely due to the variations in data and exact assumptions made.
  1888. &lt;/blockquote&gt;
  1889. Readfearn &lt;a href=&quot;https://www.theguardian.com/environment/2025/apr/01/average-person-will-be-40-poorer-if-world-warms-by-4c-new-research-shows&quot;&gt;notes that&lt;/a&gt;:
  1890. &lt;br /&gt;
  1891. &lt;blockquote&gt;
  1892. The study by Australian scientists suggests average per person GDP across the globe will be reduced by 16% even if warming is kept to 2C above pre-industrial levels. This is a much greater reduction than previous estimates, which found the reduction would be 1.4%.&lt;br /&gt;
  1893. &lt;br /&gt;
  1894. &lt;a href=&quot;https://climateactiontracker.org/global/cat-thermometer/&quot;&gt;Scientists now estimate global temperatures will rise by 2.1C&lt;/a&gt; even if countries hit short-term and long-term climate targets.
  1895. &lt;/blockquote&gt;
  1896. If global GDP drops 16% &quot;us in the North&quot; and specifically in the US are unlikely to be fine. Not that &quot;us&quot; in the US are that fine right now, as Aria Bendix' &lt;a href=&quot;https://www.nbcnews.com/health/health-news/not-even-wealth-saving-americans-dying-rates-seen-poorest-europeans-rcna198929&quot;&gt;&lt;i&gt;Not even wealth is saving Americans from dying at rates seen among some of the poorest Europeans&lt;/i&gt;&lt;/a&gt; shows:&lt;br /&gt;
  1897. &lt;blockquote&gt;
  1898. Today, the wealthiest middle-aged and older adults in the U.S. have roughly the same likelihood of dying over a 12-year period as the poorest adults in northern and western Europe, according to &lt;a href=&quot;https://dx.doi.org/10.1056/NEJMsa2408259&quot;&gt;a study published Wednesday&lt;/a&gt; in The New England Journal of Medicine.
  1899. &lt;/blockquote&gt;
  1900. Paddy Manning's &lt;a href=&quot;https://www.bloomberg.com/news/articles/2025-04-18/extreme-weather-s-true-death-toll-is-becoming-clearer&quot;&gt;&lt;i&gt;The Death Toll From Extreme Weather Is Becoming Clearer&lt;/i&gt;&lt;/a&gt; shows it will get much worse:&lt;br /&gt;
  1901. &lt;blockquote&gt;
  1902. Heat waves, wildfires, floods, tropical storms and hurricanes are all increasing in scale, frequency and intensity, and the World Health Organization &lt;a href=&quot;https://www.who.int/news-room/fact-sheets/detail/climate-change-and-health&quot;&gt;forecasts&lt;/a&gt; that climate change will cause 250,000 additional deaths each year by the end of this decade from undernutrition, malaria, diarrhea and heat stress alone. Even so, the impact on human health and the body count attributed to extreme weather remain massively underreported — resulting in a damaging feedback loop of policy inaction. Meanwhile, the very people who might fix that problem, at least in the US, are being fired en masse amid the Trump administration’s war on science.
  1903. &lt;/blockquote&gt;
  1904. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsX_WxAoiqKRNoMb03MURQ0GjMeU3UmrG-fXYfUrIHyPxWV7P3eDgps6V9tksSNATNwHpF9FvJvbKNFE6RdrED-3Pa3OcllMD8ST86yN3q18UI9ZgXKeFbct0T56nm2IqucISDMQs7MzM2k2EwNUSwj6MxNsSbnRRXE6mmdE1h3X-wFOCOrUAWXwo4_c25/s1080/Miyazaki.jpeg&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;174&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsX_WxAoiqKRNoMb03MURQ0GjMeU3UmrG-fXYfUrIHyPxWV7P3eDgps6V9tksSNATNwHpF9FvJvbKNFE6RdrED-3Pa3OcllMD8ST86yN3q18UI9ZgXKeFbct0T56nm2IqucISDMQs7MzM2k2EwNUSwj6MxNsSbnRRXE6mmdE1h3X-wFOCOrUAWXwo4_c25/w200-h174/Miyazaki.jpeg&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://x.com/Angelica_Reed1/status/1905508711671423021/photo/1&quot;&gt;Source&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  1905. But we know countries aren't going to &quot;hit short-term and long-term climate targets&quot; because, among other reasons, it would prevent us achieving the wonderful benefits of AI such as generated images in the &lt;a href=&quot;https://maxread.substack.com/p/studio-ghibli-in-the-age-of-ai-reproduction&quot;&gt;style of Studio Ghibli&lt;/a&gt;.&lt;br /&gt;
  1906. &lt;br /&gt;
  1907. Kate Aranoff explains that people like Schmidt aren't talking 2C any longer in &lt;a href=&quot;https://newrepublic.com/article/193698/climate-realism-degrees-immigration&quot;&gt;&lt;i&gt;The Bleak, Defeatist Rise of “Climate Realism”&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1908. &lt;blockquote&gt;
  1909. Owing to “recent setbacks to global decarbonization efforts,” Morgan Stanley analysts &lt;a href=&quot;https://www.eenews.net/articles/big-banks-predict-catastrophic-warming-with-profit-potential/&quot;&gt;wrote&lt;/a&gt; in a research report last month, they “now expect a 3°C world.” The “baseline” scenario that JP Morgan Chase uses to assess its own transition risk—essentially, the economic impact that decarbonization could have on its high-carbon investments—similarly “assumes that no additional emissions reduction policies are implemented by governments” and that the world could reach “3°C or more of warming” by 2100. The &lt;a href=&quot;https://www.axios.com/2025/04/07/climate-change-realism-security&quot;&gt;Climate Realism Initiative&lt;/a&gt; launched on Monday by the Council on Foreign Relations similarly presumes that the world is likely on track to warm on average by three degrees or more this century. The essay announcing the initiative calls the prospect of reaching net-zero global emissions by 2050 “utterly implausible.”&lt;br /&gt;
  1910. ...&lt;br /&gt;
  1911. Bleak as warming projections are, a planet where governments and businesses fight to the death for their own profitable share of a hotter, more chaotic planet is bleaker still. The only thing worse than a right wing that doesn’t take climate change seriously might be one that does, and can muster support from both sides of the aisle to put “America First” in a warming, warring world.
  1912. &lt;/blockquote&gt;
  1913. Of course, we might as well &quot;enjoy a few more nice years on earth&quot;, because the 70-year-old Schmidt (and I) will be dead long before 2100. Our grandchildren will just have to figure something out. In the meantime we need to make as much money as possible so the grandchildren can afford their bunkers.&lt;br /&gt;
  1914. &lt;br /&gt;
  1915. Lets look at some of the actions of the global elite instead of the words they use when not talking to their escorts, starting with Nvidia the picks and shovels provider to the AI boom.&lt;br /&gt;
  1916. &lt;br /&gt;
  1917. Tobias Mann's &lt;a href=&quot;https://www.theregister.com/2025/03/29/nvidia_moores_law/&quot;&gt;&lt;i&gt;Nvidia GPU roadmap confirms it: Moore’s Law is dead and buried&lt;/i&gt;&lt;/a&gt; sets up the plan:&lt;br /&gt;
  1918. &lt;blockquote&gt;
  1919. Nvidia's path forward is clear: its compute platforms are only going to get bigger, denser, hotter and more power hungry from here on out. As a calorie deprived Huang put it during his press Q&amp;amp;A last week, the practical limit for a rack is however much power you can feed it.&lt;br /&gt;
  1920. &lt;br /&gt;
  1921. &quot;A datacenter is now 250 megawatts. That's kind of the limit per rack. I think the rest of it is just details,&quot; Huang said. &quot;If you said that a datacenter is a gigawatt, and I would say a gigawatt per rack sounds like a good limit.&quot;
  1922. &lt;/blockquote&gt;
  1923. Most current Nvidia racks are 60KW &lt;a href=&quot;https://www.theregister.com/2025/03/29/nvidia_moores_law/&quot;&gt;but&lt;/a&gt;:&lt;br /&gt;
  1924. &lt;blockquote&gt;
  1925. The NVL72 is a rackscale design inspired heavily by the hyperscalers with DC bus bars, power sleds, and networking out the front. And at 120kW of liquid cooled compute, deploying more than a few of these things in existing facilities gets problematic in a hurry. And this is only going to get even more difficult once Nvidia's 600kW monster racks make their debut in late 2027.&lt;br /&gt;
  1926. &lt;br /&gt;
  1927. This is where those &quot;&lt;a href=&quot;https://blogs.nvidia.com/blog/ai-factory/&quot;&gt;AI factories&lt;/a&gt;&quot; Huang keeps rattling on about come into play — purpose built datacenters designed in collaboration with partners like Schneider Electric to cope with the power and thermal demands of AI.
  1928. &lt;/blockquote&gt;
  1929. So Nvidia plans to increase the power draw per rack by 10x. The funds to build the &quot;AI factories&quot; to house them are being raised right now as David Gerard reports in &lt;a href=&quot;https://pivot-to-ai.com/2025/04/10/a16z-raising-fresh-20b-for-ai-bubble-investments/&quot;&gt;&lt;i&gt;a16z raising fresh $20b to keep the AI bubble pumping&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1930. &lt;blockquote&gt;
  1931. Venture capital firm Andreessen Horowitz, affectionately known as a16z, is looking for investors to put a fresh $20 billion into AI startups. [&lt;a href=&quot;https://www.reuters.com/business/finance/andreessen-horowitz-seeks-raise-20-billion-megafund-amid-global-interest-us-ai-2025-04-08/&quot;&gt;Reuters&lt;/a&gt;]&lt;br /&gt;
  1932. &lt;br /&gt;
  1933. For perspective, that’s more than all US venture capital funding in the first three months of 2025, which was $17 billion. [&lt;a href=&quot;https://pitchbook.com/news/articles/a16z-raising-20b-more-capital-than-vc-2025-q1&quot;&gt;PitchBook&lt;/a&gt;]&lt;br /&gt;
  1934. &lt;br /&gt;
  1935. This means that a16z think there’s at least this much money sloshing about, sufficiently desperate for a return.&lt;br /&gt;
  1936. &lt;br /&gt;
  1937. PitchBook says a16z is talking up its links to the Trump administration to try to recruit investors — the pitch is to get on the inside of the trading! This may imply a less than robustly and strictly rule-of-law investment environment.
  1938. &lt;/blockquote&gt;
  1939. The power to supply them is natural gas on a huge scale. See for example, Reid Frazier's &lt;a href=&quot;https://www.alleghenyfront.org/homer-city-pennsylvania-gas-plant-data-center-ai/&quot;&gt;&lt;i&gt;LARGEST NATURAL GAS POWER PLANT IN THE COUNTRY, DATA CENTER COMING TO FORMER HOMER CITY COAL PLANT&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1940. &lt;blockquote&gt;
  1941. The owners of a recently demolished coal-fired power plant announced the site will become a data center powered by the largest natural gas plant in the country. &lt;br /&gt;
  1942. &lt;br /&gt;
  1943. The Homer City Generating Station in Indiana County was &lt;a href=&quot;https://www.alleghenyfront.org/homer-city-pa-largest-coal-fired-power-plant-will-close-in-july/&quot;&gt;decommissioned in 2023&lt;/a&gt; and parts of it were &lt;a href=&quot;https://www.wtae.com/article/pennsylvania-demolition-homer-city-stacks/64257828&quot;&gt;imploded last month&lt;/a&gt;. It had been at one time the largest coal-fired power plant in Pennsylvania.&lt;br /&gt;
  1944. &lt;br /&gt;
  1945. The plant’s owners, Homer City Redevelopment, announced the site will become a 3,200-acre data center campus for artificial intelligence and other computing needs
  1946. &lt;/blockquote&gt;
  1947. The &quot;largest natural gas plant in the country&quot; will be pumping out carbon dioxide for its &lt;a href=&quot;http://www.bcri.com/Downloads/Power%20Generation%20Life%20Analysis.pdf&quot;&gt;predicted service life of 75 years&lt;/a&gt;, into the 3C period of 2100.&lt;br /&gt;
  1948. &lt;br /&gt;
  1949. Alas, it isn't just natural gas that will be pumping out carbon dioxide, as Ari Natter and Jennifer A Dlouhy report in &lt;a href=&quot;https://www.bloomberg.com/news/articles/2025-04-08/trump-order-seeks-to-tap-coal-power-in-quest-to-dominate-ai&quot;&gt;&lt;i&gt;Trump Signs Orders to Expand Coal Power, Invoking AI Boom&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1950. &lt;blockquote&gt;
  1951. Taken together, the measures represent a sweeping attempt to ensure coal remains part of the US electricity mix, despite its higher greenhouse gas emissions and frequently greater cost when compared to natural gas or solar power.&lt;br /&gt;
  1952. &lt;br /&gt;
  1953. The effort also underscores Trump’s commitment to tapping America’s coal resources as a source of both electricity to run data centers and heat to forge steel. The president and administration officials have made clear boosting coal-fired power is a priority, one they see as intertwined with national security and the US standing in a global competition to dominate the artificial intelligence industry.
  1954. &lt;/blockquote&gt;
  1955. Carbon emissions aren't the only problem this mania causes, as Luke Barratt &lt;i&gt;et al&lt;/i&gt; report in &lt;a href=&quot;https://www.source-material.org/amazon-microsoft-google-trump-data-centres-water-use/&quot;&gt;&lt;i&gt;Big Tech’s data centre push will take water from the world’s driest areas&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1956. &lt;blockquote&gt;
  1957. Amazon, Microsoft and Google are operating data centres that use vast amounts of water in some of the world’s driest areas and are building many more, an investigation by SourceMaterial and The Guardian has found.&lt;br /&gt;
  1958. &lt;br /&gt;
  1959. With US President Donald Trump pledging to support them, the three technology giants are planning hundreds of data centres in the US and across the globe, with a potentially huge impact on populations already living with water scarcity. &lt;br /&gt;
  1960. &lt;br /&gt;
  1961. “The question of water is going to become crucial,” said Lorena Jaume-Palasí, founder of The Ethical Tech Society. “Resilience from a resource perspective is going to be very difficult for those communities.”&lt;br /&gt;
  1962. &lt;br /&gt;
  1963. Efforts by Amazon, the world’s biggest online retailer, to mitigate its water use have sparked opposition from inside the company, SourceMaterial’s investigation found, with one of its own sustainability experts warning that its plans are “not ethical”.
  1964. &lt;/blockquote&gt;
  1965. &lt;a href=&quot;https://www.source-material.org/amazon-microsoft-google-trump-data-centres-water-use/&quot;&gt;And&lt;/a&gt;:
  1966. &lt;blockquote&gt;
  1967. Amazon’s three proposed data centres in Aragon, northern Spain—each next to an existing Amazon data centre—are licensed to use an estimated 755,720 cubic metres of water a year, enough to irrigate more than 200 hectares (500 acres) of corn, one of the region’s main crops.&lt;br /&gt;
  1968. &lt;br /&gt;
  1969. In practice, the water usage will be even higher as that figure doesn’t take into account water used in generating electricity to power the new installations, said Aaron Wemhoff, an energy efficiency specialist at Villanova University in Pennsylvania.&lt;br /&gt;
  1970. &lt;br /&gt;
  1971. Between them, Amazon’s planned Aragon data centres will use more electricity than the entire region currently consumes. Meanwhile, Amazon in December asked the regional government for permission to increase water consumption at its three existing data centres by 48 per cent.&lt;br /&gt;
  1972. &lt;br /&gt;
  1973. Opponents have accused the company of being undemocratic by trying to rush through its application over the Christmas period. More water is needed because “climate change will lead to an increase in global temperatures and the frequency of extreme weather events, including heat waves”, Amazon wrote in its application.
  1974. &lt;/blockquote&gt;
  1975. Right. We need to use more water to cope with the &quot;extreme weather events, including heat waves&quot; we are causing, which will allow us to cause more &quot;extreme weather events&quot; which will mean we need more water! It is a vicious cycle.&lt;br /&gt;
  1976. &lt;br /&gt;
  1977. Is there really a demand for these monsters? One of Nvidia's big customers is &lt;a href=&quot;https://www.wheresyoured.at/core-incompetency/&quot;&gt;CoreWeave&lt;/a&gt;:
  1978. &lt;blockquote&gt;
  1979. NVIDIA, &lt;a href=&quot;https://www.coreweave.com/blog/coreweave-becomes-the-first-cloud-provider-with-generally-available-nvidia-gb200-nvl72-instances&quot;&gt;which has given CoreWeave priority access to its chips&lt;/a&gt;. &lt;a href=&quot;https://techcrunch.com/2025/03/05/ahead-of-a-possible-4-billion-ipo-coreweaves-founders-already-pocketed-488-million/&quot;&gt;As CoreWeave’s own S-1 notes, it was “the first cloud provider to make NVIDIA GB200 NVL72-based instances generally available,” and “among the first cloud providers to deploy high-performance infrastructure with NVIDIA H100, H200, and GH200.”&lt;/a&gt;&lt;br /&gt;
  1980. &lt;br /&gt;
  1981. &lt;a href=&quot;https://techcrunch.com/2025/03/05/ahead-of-a-possible-4-billion-ipo-coreweaves-founders-already-pocketed-488-million/&quot;&gt;CoreWeave owns over 250,000 NVIDIA GPUs&lt;/a&gt; across 32 data centers, supported by more than 260MW of active power
  1982. &lt;/blockquote&gt;
  1983. Before CoreWeave's recent IPO  Ed Zitron wrote &lt;a href=&quot;https://www.wheresyoured.at/core-incompetency/&quot;&gt;&lt;i&gt;CoreWeave Is A Time Bomb&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1984. &lt;blockquote&gt;
  1985. In my years writing this newsletter I have come across few companies as rotten as &lt;a href=&quot;http://www.coreweave.com/&quot;&gt;CoreWeave&lt;/a&gt; — an &quot;AI cloud provider&quot; that sells GPU compute to AI companies looking to run or train their models. &lt;br /&gt;
  1986. &lt;br /&gt;
  1987. CoreWeave had &lt;a href=&quot;https://www.marketwatch.com/story/what-is-coreweave-the-cloud-services-provider-expected-to-go-public-next-week-615a4d85i&quot;&gt;intended to go public last week&lt;/a&gt;, with an initial valuation of $35bn. While it’s hardly a recognizable name — like, say, OpenAI, or Microsoft, or Nvidia — this company is worth observing, if not for the fact that it’s arguably the first major IPO that we’ve seen from the current generative AI hype bubble, and undoubtedly the biggest.
  1988. &lt;/blockquote&gt;
  1989. It seems a lot of investors agreed with Zitron. The IPO went so badly that Nvidia had to step in and support it.  Craig Coben has the story in &lt;a href=&quot;https://www.ft.com/content/519006a9-82c6-4939-afa1-58921a399105&quot;&gt;&lt;i&gt;CoreWeave’s IPO tested the waters — and missed the mark&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  1990. &lt;blockquote&gt;
  1991. The initial public offering of AI infrastructure firm CoreWeave, initially targeting a $2.7bn raise at $47-55 per share, &lt;a href=&quot;https://www.ft.com/content/b1e962a8-ec26-41f8-a491-ad127349c1b4&quot;&gt;was slashed to $1.5bn at $40 per share&lt;/a&gt;. Even then, the deal barely limped across the finish line, thanks to a last-minute &lt;a href=&quot;https://www.reuters.com/markets/deals/nvidia-backed-coreweaves-shares-likely-open-up-25-above-ipo-price-2025-03-28/&quot;&gt;$250mn “anchor” order from Nvidia&lt;/a&gt;. The offering reportedly ended up with just three investors holding 50 per cent of the stock, and it seems to have required some stabilisation from lead bank Morgan Stanley to avoid a first-day drop. Hardly a textbook success.
  1992. &lt;/blockquote&gt;
  1993. Bryce Elder lists reasons why investors might be skeptical in &lt;a href=&quot;https://www.ft.com/content/a96d65c1-8ba7-4055-b952-61c5b5d500fc&quot;&gt;&lt;i&gt;Eight odd things in CoreWeave’s IPO prospectus&lt;/i&gt;&lt;/a&gt;. Elder starts with an analogy:&lt;br /&gt;
  1994. &lt;blockquote&gt;
  1995. Imagine a caravan maker. It sells caravans to a caravan park that only buys one type of caravan. The caravan park leases much of its land from another caravan park. The first caravan park has two big customers. One of the big customers is the caravan maker. The other big customer is the caravan maker’s biggest customer. The biggest customer of the second caravan park is the first caravan park.&lt;br /&gt;
  1996. &lt;br /&gt;
  1997. Sorry, not caravans. GPUs.
  1998. &lt;/blockquote&gt;
  1999. Another reason investors might be skeptical is that technologies advance in S-curves. Frank Landymore's &lt;a href=&quot;https://futurism.com/ai-researchers-tech-industry-dead-end&quot;&gt;&lt;i&gt;Majority of AI Researchers Say Tech Industry Is Pouring Billions Into a Dead End&lt;/i&gt;&lt;/a&gt; suggests that generative AI has hit the flat part of the S-curve:&lt;br /&gt;
  2000. &lt;blockquote&gt;
  2001. You can only throw so much money at a problem.&lt;br /&gt;
  2002. &lt;br /&gt;
  2003. This, more or less, is the line being taken by AI researchers in a &lt;a href=&quot;https://aaai.org/about-aaai/presidential-panel-on-the-future-of-ai-research/&quot;&gt;recent survey&lt;/a&gt;. Asked whether &quot;scaling up&quot; current AI approaches could lead to achieving artificial general intelligence (AGI), or a general purpose AI that matches or surpasses human cognition, an overwhelming 76 percent of respondents said it was &quot;unlikely&quot; or &quot;very unlikely&quot; to succeed.
  2004. &lt;br /&gt;
  2005. &quot;The vast investments in scaling, unaccompanied by any comparable efforts to understand what was going on, always seemed to me to be misplaced,&quot; Stuart Russel, a computer scientist at UC Berkeley who helped organize the report, &lt;a href=&quot;https://www.newscientist.com/article/2471759-ai-scientists-are-sceptical-that-modern-models-will-lead-to-agi/&quot;&gt;told NewScientist&lt;/a&gt;. &quot;I think that, about a year ago, it started to become obvious to everyone that the benefits of scaling in the conventional sense had plateaued.&quot;
  2006. &lt;/blockquote&gt;
  2007. Stanford shares Berkeley's concerns as Dan Robinson reports in &lt;a href=&quot;https://www.theregister.com/2025/04/11/stanford_ai_report/&quot;&gt;&lt;i&gt;Billions pour into AI as emissions rise, returns stay pitiful, say Stanford boffins&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2008. &lt;blockquote&gt;
  2009. AI continues to improve – at least according to benchmarks. But the promised benefits have largely yet to materialize while models are increasing in size and becoming more computationally demanding, and greenhouse gas emissions from AI training continue to rise.&lt;br /&gt;
  2010. &lt;br /&gt;
  2011. These are some of the takeaways from the &lt;a href=&quot;https://hai-production.s3.amazonaws.com/files/hai_ai_index_report_2025.pdf&quot;&gt;AI Index Report 2025 [PDF]&lt;/a&gt;, a lengthy and in-depth publication from Stanford University's Institute for Human-Centered AI (HAI) that covers development, investment, adoption, governance and even global attitudes towards artificial intelligence, giving a snapshot of the current state of play.&lt;br /&gt;
  2012. ...&lt;br /&gt;
  2013. However, HAI highlights the enormous level of investment still being pumped into the sector, with global corporate AI investment reaching $252.3 billion in 2024, up 26 percent for the year. Most of this is in the US, which hit $109.1 billion, nearly 12 times higher than China's $9.3 billion and 24 times the UK's $4.5 billion, it says.&lt;br /&gt;
  2014. &lt;br /&gt;
  2015. Despite all this investment, &quot;most companies that report financial impacts from using AI within a business function estimate the benefits as being at low levels,&quot; the report writes.&lt;br /&gt;
  2016. &lt;br /&gt;
  2017. It says that 49 percent of organizations using AI in service operations reported cost savings, followed by supply chain management (43 percent) and software engineering (41 percent), but in most cases, the cost savings are less than 10 percent.&lt;br /&gt;
  2018. &lt;br /&gt;
  2019. When it comes to revenue gains, 71 percent of respondents using AI in marketing and sales reported gains, while 63 percent in supply chain management and 57 percent in service operations, but the most common level of revenue increase is less than 5 percent.&lt;br /&gt;
  2020. &lt;br /&gt;
  2021. Meanwhile, despite the modest returns, the HAI report warns that the amount of compute used to train top-notch AI models is doubling approximately every 5 months, the size of datasets required for LLM training is doubling every eight months, and the energy consumed for training is doubling annually.&lt;br /&gt;
  2022. &lt;br /&gt;
  2023. This is leading to rapidly increasing greenhouse gas emissions resulting from AI training, the report finds. It says that early AI models such as AlexNet over a decade ago caused only modest CO₂ emissions of 0.01 tons, while GPT-4 (2023) was responsible for emitting 5,184 tons, and Llama 3.1 405B (2024) pumping out 8,930 tons. This compares with about 18 tons of carbon a year the average American emits, it claims.
  2024. &lt;/blockquote&gt;
  2025. Recent developments in China have &lt;a href=&quot;https://futurism.com/ai-researchers-tech-industry-dead-end&quot;&gt;validated the researchers concerns&lt;/a&gt;:&lt;br /&gt;
  2026. &lt;blockquote&gt;
  2027. The premise that AI could be indefinitely improved by scaling was always on shaky ground. Case in point, the tech sector's recent existential crisis &lt;a href=&quot;https://futurism.com/silicon-valley-shambles-chinese-startup-deepseek&quot;&gt;precipitated by the Chinese startup DeepSeek&lt;/a&gt;, whose AI model could go toe-to-toe with the West's flagship, multibillion-dollar chatbots at purportedly a fraction of the training cost and power.&lt;br /&gt;
  2028. &lt;br /&gt;
  2029. Of course, the writing had been on the wall before that. In November last year, reports indicated that OpenAI researchers discovered that the upcoming version of its GPT large language model &lt;a href=&quot;https://www.theinformation.com/articles/openai-shifts-strategy-as-rate-of-gpt-ai-improvements-slows&quot;&gt;displayed significantly less improvement&lt;/a&gt;, and in some cases, no improvements at all than previous versions did over their predecessors.
  2030. &lt;/blockquote&gt;
  2031. In fact, the models can get &lt;i&gt;worse&lt;/i&gt;, as Maxwell Zeff reports in &lt;a href=&quot;https://techcrunch.com/2025/04/18/openais-new-reasoning-ai-models-hallucinate-more/&quot;&gt;&lt;i&gt;OpenAI’s new reasoning AI models hallucinate more&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2032. &lt;blockquote&gt;
  2033. According to OpenAI’s internal tests, o3 and o4-mini, which are so-called reasoning models, hallucinate more often than the company’s previous reasoning models — o1, o1-mini, and o3-mini — as well as OpenAI’s traditional, “non-reasoning” models, such as GPT-4o.&lt;br /&gt;
  2034. &lt;br /&gt;
  2035. Perhaps more concerning, the ChatGPT maker doesn’t really know why it’s happening.&lt;br /&gt;
  2036. &lt;br /&gt;
  2037. In its technical report for o3 and o4-mini, OpenAI writes that “more research is needed” to understand why hallucinations are getting worse as it scales up reasoning models. O3 and o4-mini perform better in some areas, including tasks related to coding and math. But because they “make more claims overall,” they’re often led to make “more accurate claims as well as more inaccurate/hallucinated claims,” per the report.&lt;br /&gt;
  2038. &lt;br /&gt;
  2039. OpenAI found that o3 hallucinated in response to 33% of questions on PersonQA, the company’s in-house benchmark for measuring the accuracy of a model’s knowledge about people. That’s roughly double the hallucination rate of OpenAI’s previous reasoning models, o1 and o3-mini, which scored 16% and 14.8%, respectively. O4-mini did even worse on PersonQA — hallucinating 48% of the time.
  2040. &lt;/blockquote&gt;
  2041. It may well turn out that people put more value on being right than being plausible.&lt;br /&gt;
  2042. &lt;br /&gt;
  2043. Increasingly, there are other signs that the current, costly, proprietary AI approach is coming to an end. For example, we have Matt Asay's &lt;a href=&quot;https://www.infoworld.com/article/3960764/deepseeks-open-source-movement.html&quot;&gt;&lt;i&gt;DeepSeek’s open source movement&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2044. &lt;blockquote&gt;
  2045. It’s increasingly common in AI circles to refer to the “&lt;a href=&quot;https://www.google.com/search?q=deepseek+moment&amp;amp;rlz=1C5GCCM_en&amp;amp;oq=deepseek+moment&amp;amp;gs_lcrp=EgZjaHJvbWUyCQgAEEUYORiABDIHCAEQABiABDIHCAIQABiABDIHCAMQABiABDIICAQQABgWGB4yCggFEAAYgAQYogQyCggGEAAYgAQYogQyBwgHEAAY7wUyCggIEAAYgAQYogQyBwgJEAAY7wXSAQg1MTU1ajBqNKgCALACAA&amp;amp;sourceid=chrome&amp;amp;ie=UTF-8&quot;&gt;DeepSeek moment&lt;/a&gt;,” but calling it a moment fundamentally misunderstands its significance. DeepSeek didn’t just have a moment. It’s now very much a movement, one that will frustrate all efforts to contain it. DeepSeek, and the open source AI ecosystem surrounding it, has rapidly evolved from a brief snapshot of technological brilliance into something much bigger—and much harder to stop. Tens of thousands of developers, from seasoned researchers to passionate hobbyists, are now working on enhancing, tuning, and extending these open source models in ways no centralized entity could manage alone.&lt;br /&gt;
  2046. &lt;br /&gt;
  2047. For example, it’s perhaps not surprising that &lt;a href=&quot;https://siliconangle.com/2025/01/28/hugging-face-wants-reverse-engineer-deepseeks-r1-reasoning-model/&quot;&gt;Hugging Face is actively attempting&lt;/a&gt; to reverse engineer and publicly disseminate DeepSeek’s R1 model. Hugging Face, while important, is just one company, just one platform. But Hugging Face has attracted hundreds of thousands of developers who actively contribute to, adapt, and build on open source models, driving AI innovation at a speed and scale unmatched even by the most agile corporate labs.
  2048. &lt;/blockquote&gt;
  2049. For another example, there is Kyle Orland's &lt;a href=&quot;https://arstechnica.com/ai/2025/04/microsoft-researchers-create-super%e2%80%91efficient-ai-that-uses-up-to-96-less-energy/&quot;&gt;&lt;i&gt; Microsoft’s “1‑bit” AI model runs on a CPU only, while matching larger systems&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2050. &lt;blockquote&gt;
  2051. Now, researchers at Microsoft's &lt;a href=&quot;https://www.microsoft.com/en-us/research/group/general-artificial-intelligence/&quot;&gt;General Artificial Intelligence group&lt;/a&gt; have &lt;a href=&quot;https://huggingface.co/microsoft/bitnet-b1.58-2B-4T&quot;&gt;released a new neural network model&lt;/a&gt; that works with just three distinct weight values: -1, 0, or 1. Building on top of previous work Microsoft Research published in 2023, the new model's &quot;ternary&quot; architecture reduces overall complexity and &quot;substantial advantages in computational efficiency,&quot; the researchers write, allowing it to &lt;a href=&quot;https://github.com/microsoft/BitNet&quot;&gt;run effectively on a simple desktop CPU&lt;/a&gt;. And despite the massive reduction in weight precision, the researchers claim that the model &quot;can achieve performance comparable to leading open-weight, full-precision models of similar size across a wide range of tasks.&quot;
  2052. &lt;/blockquote&gt;
  2053. The problem is that the driving force behind the data center boom isn't the AI researchers, it is the real estate people.  David Gerard's &lt;a href=&quot;https://pivot-to-ai.com/2025/04/04/china-massively-overbuilds-empty-ai-data-centres/&quot;&gt;&lt;i&gt;China massively overbuilds empty AI data centres&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2054. &lt;blockquote&gt;
  2055. ChatGPT came out in 2022, and the Chinese government declared AI infrastructure a national priority. Over 500 new data centres were announced in 2023 and 2024. Private investors went all-in.&lt;br /&gt;
  2056. &lt;br /&gt;
  2057. Demand for the data centres turns out not to be there. Around 80% are not actually in use. [&lt;a href=&quot;https://www.technologyreview.com/2025/03/26/1113802/china-ai-data-centers-unused/&quot;&gt;MIT Technology Review&lt;/a&gt;]&lt;br /&gt;
  2058. &lt;br /&gt;
  2059. The business model was to rent GPUs. &lt;a href=&quot;https://pivot-to-ai.com/2025/01/28/deepseek-slaps-openai-tech-stocks-crash/&quot;&gt;DeepSeek&lt;/a&gt; knifed that, much as it did OpenAI. There’s now a lot of cheap GPU in China. Data centre projects are having trouble finding new investment.&lt;br /&gt;
  2060. &lt;br /&gt;
  2061. The Chinese data centre boom was a real estate deal — many investors pivoted straight from real estate to AI.
  2062. &lt;/blockquote&gt;
  2063. Paul Kunert talked to someone with long experience of the data center real estate business in &lt;a href=&quot;https://www.theregister.com/2025/04/14/datacenter_spending_ai/&quot;&gt;&lt;i&gt;Dot com era crash on the cards for AI datacenter spending? It's a 'risk'&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2064. &lt;blockquote&gt;
  2065. Having lived through the early days of the internet frenzy, Fabrice Coquio, senior veep at Digital Realty, which bills itself as the world's largest provider of cloud and carrier-neutral datacenter, colocation and interconnection services, is perhaps better placed than most to venture an opinion. Is there a bubble?&lt;br /&gt;
  2066. &lt;br /&gt;
  2067. &quot;I have been in this industry for 25 years, so I've seen some ups and downs. At the moment, definitely that's on the very bullish side, particularly because of what people believe will be required for AI,&quot; he tells The Register.&lt;br /&gt;
  2068. &lt;br /&gt;
  2069. Grabbing a box of Kleenex tissues, he quips that back at the turn of the millennium, if investors were told the internet was inside they would have rushed to buy it. &quot;Today I am telling you there is AI inside. So buy it.&quot;&lt;br /&gt;
  2070. &lt;br /&gt;
  2071. &quot;Is there a bubble? Potentially? I see the risk, because when some of the traditional investments in real estate – like housing, logistics and so on – are not that important, people are looking to invest their amazing capacity of available funds in new segments, and they say, 'Oh, why not datacenters?'&quot;&lt;br /&gt;
  2072. &lt;br /&gt;
  2073. He adds: &quot;In the UK, in France, in Germany, you've got people coming from nowhere having no experiences… that have no idea about what AI and datacenters are really and still investing in them.&lt;br /&gt;
  2074. &lt;br /&gt;
  2075. &quot;It's the expression of a typical bubble. At the same time, is the driver of AI a big thing? Yes… [with] AI [there] is a sense of incredible productivity for companies and then for individuals. And this might change drastically the way we work, we operate, and we deliver something in a more efficient way.
  2076. &lt;/blockquote&gt;
  2077. The &quot;&lt;a href=&quot;https://blog.dshr.org/2018/01/it-isnt-about-technology.html&quot;&gt;slow AIs&lt;/a&gt;&quot; that run the major AI companies hallucinated a future where scaling continued to work and have already sunk vast sums into data centers. The &quot;slow AIs&quot; &lt;a href=&quot;https://futurism.com/ai-researchers-tech-industry-dead-end&quot;&gt;&lt;i&gt;can't be wrong&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2078. &lt;blockquote&gt;
  2079. Nonetheless, if Microsoft's commitment to still spending tens of billions of dollars on data centers is any indication, brute force scaling is still going to be the favored MO for the titans of the industry — while it'll be left to the scrappier startups to scrounge for ways to do more with less.
  2080. &lt;/blockquote&gt;
  2081. Sophia Chen's &lt;a href=&quot;https://www.nature.com/articles/d41586-025-01113-z&quot;&gt;&lt;i&gt;Data centres will use twice as much energy by 2030 — driven by AI&lt;/i&gt;&lt;/a&gt; shows the International Energy Agency believes the data centers will get built:&lt;br /&gt;
  2082. &lt;blockquote&gt;
  2083. The IEA’s models project that data centres will use 945 terawatt-hours (TWh) in 2030, roughly equivalent to the current annual electricity consumption of Japan. By comparison, data centres consumed 415 TWh in 2024, roughly 1.5% of the world’s total electricity consumption (see ‘Global electricity growth’).&lt;br /&gt;
  2084. &lt;br /&gt;
  2085. The projections focus mostly on data centres in general, which also run computing tasks other than AI — although the agency estimated the proportion of data-centre servers devoted to AI. They found that such servers accounted for 24% of server electricity demand and 15% of total data-centre energy demand in 2024.&lt;br /&gt;
  2086. &lt;/blockquote&gt;
  2087. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMK4VRzosDa0UooAkAfF00GZEVLFasxYZv38-1o9Vpxh__H6iZUN5SAFMZoTVLyhICSCCyAm5J7Nl0T-5bdoXQIgLGKLvqDD-bQ35ThwcSs7g70Qe5sj2YroULiRiOOe30Uy1s4mW2p2QxISmfmwqDAvOlQ2l9zFIdsGYaEKeenUt8s924BPJq1ia93TaF/s1524/IAE-GlobalGrowt.png&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;127&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMK4VRzosDa0UooAkAfF00GZEVLFasxYZv38-1o9Vpxh__H6iZUN5SAFMZoTVLyhICSCCyAm5J7Nl0T-5bdoXQIgLGKLvqDD-bQ35ThwcSs7g70Qe5sj2YroULiRiOOe30Uy1s4mW2p2QxISmfmwqDAvOlQ2l9zFIdsGYaEKeenUt8s924BPJq1ia93TaF/w200-h127/IAE-GlobalGrowt.png&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://www.nature.com/articles/d41586-025-01113-z&quot;&gt;Source&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  2088. AI boosters will point out that there are &lt;a href=&quot;https://www.nature.com/articles/d41586-025-01113-z&quot;&gt;other demand growth culprits&lt;/a&gt;, but:&lt;br /&gt;
  2089. &lt;blockquote&gt;
  2090. Alex de Vries, a researcher at the Free University of Amsterdam and the founder of Digiconomist, who was not involved with the report, thinks this is an underestimate. The report “is a bit vague when it comes to AI specifically”, he says.&lt;br /&gt;
  2091. &lt;br /&gt;
  2092. Even with these uncertainties, “we should be mindful about how much energy is ultimately being consumed by all these data centres”, says de Vries. “Regardless of the exact number, we’re talking several percentage of our global electricity consumption.”
  2093. &lt;/blockquote&gt;
  2094. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMK4VRzosDa0UooAkAfF00GZEVLFasxYZv38-1o9Vpxh__H6iZUN5SAFMZoTVLyhICSCCyAm5J7Nl0T-5bdoXQIgLGKLvqDD-bQ35ThwcSs7g70Qe5sj2YroULiRiOOe30Uy1s4mW2p2QxISmfmwqDAvOlQ2l9zFIdsGYaEKeenUt8s924BPJq1ia93TaF/s1524/IAE-GlobalGrowt.png&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;127&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMK4VRzosDa0UooAkAfF00GZEVLFasxYZv38-1o9Vpxh__H6iZUN5SAFMZoTVLyhICSCCyAm5J7Nl0T-5bdoXQIgLGKLvqDD-bQ35ThwcSs7g70Qe5sj2YroULiRiOOe30Uy1s4mW2p2QxISmfmwqDAvOlQ2l9zFIdsGYaEKeenUt8s924BPJq1ia93TaF/w200-h127/IAE-GlobalGrowt.png&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://www.nature.com/articles/d41586-025-01113-z&quot;&gt;Source&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  2095. James Temple critiques the &lt;a href=&quot;https://www.iea.org/reports/energy-and-ai&quot;&gt;International Energy Agency report&lt;/a&gt; in &lt;a href=&quot;https://www.technologyreview.com/2025/04/10/1114912/why-the-climate-promises-of-ai-sound-a-lot-like-carbon-offsets/&quot;&gt;&lt;i&gt;Why the climate promises of AI sound a lot like carbon offsets&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2096. &lt;blockquote&gt;
  2097. There are reasonable arguments to suggest that AI tools may eventually help reduce emissions, as the IEA report underscores. But what we know for sure is that they’re driving up energy demand and emissions today—especially in the regional pockets where data centers are clustering. &lt;br /&gt;
  2098. &lt;br /&gt;
  2099. So far, these facilities, which generally run around the clock, are &lt;a href=&quot;https://www.chevron.com/newsroom/2025/q1/natural-gas-is-helping-power-the-use-of-artificial-intelligence&quot;&gt;substantially powered&lt;/a&gt; through natural-gas turbines, which produce significant levels of planet-warming emissions. Electricity demands are rising so fast that developers are proposing to build &lt;a href=&quot;https://www.datacenterdynamics.com/en/news/dominion-energy-seeks-approval-for-1gw-virginia-gas-plant-amid-data-center-boom/&quot;&gt;new gas plants&lt;/a&gt; and convert &lt;a href=&quot;https://apnews.com/article/technology-ai-natural-gas-electricity-pennsylvania-450534992fab8dd3527b64b92614259e&quot;&gt;retired coal plants&lt;/a&gt; to supply the buzzy industry.
  2100. &lt;/blockquote&gt;
  2101. If the data centers get built, they will add to carbon emissions and push us closer to 3C sooner.  Of course, this investment in data centers needs to generate a return, but it may well turn out that the market isn't willing to pay enough for Ghibli-style memes to provide it.  Ed Zitron has been hammering away at this point, for example in &lt;a href=&quot;https://www.wheresyoured.at/wheres-the-money/&quot;&gt;&lt;i&gt;There Is No AI Revolution&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2102. &lt;blockquote&gt;
  2103. Putting aside the hype and bluster, OpenAI — as with all generative AI model developers — loses money on every single prompt and output. Its products do not scale like traditional software, in that the more users it gets, the more expensive its services are to run because its models are so compute-intensive.&lt;br /&gt;
  2104. &lt;br /&gt;
  2105. For example, &lt;a href=&quot;https://www.reuters.com/technology/artificial-intelligence/openais-weekly-active-users-surpass-400-million-2025-02-20/&quot;&gt;ChatGPT having 400 million weekly active users&lt;/a&gt; is not the same thing as a traditional app like Instagram or Facebook having that many users. The cost of serving a regular user of an app like Instagram is significantly smaller, because these are, effectively, websites with connecting APIs, images, videos and user interactions. These platforms aren’t innately compute-heavy, at least to the same extent as generative AI, and so you don’t require the same level of infrastructure to support the same amount of people. &lt;br /&gt;
  2106. &lt;br /&gt;
  2107. Conversely, generative AI requires expensive-to-buy and expensive-to-run GPUs, both for inference and training the models themselves. The GPUs must be run at full tilt for both inference and training models, which shortens their lifespan, while also consuming ungodly amounts of energy. And surrounding that GPU is the rest of the computer, which is usually highly-specced, and thus, expensive.
  2108. &lt;/blockquote&gt;
  2109. Zitron makes an &lt;a href=&quot;https://www.wheresyoured.at/openai-is-a-systemic-risk-to-the-tech-industry-2/&quot;&gt;important point&lt;/a&gt;:&lt;br /&gt;
  2110. &lt;blockquote&gt;
  2111. OpenAI, &lt;a href=&quot;https://www.wheresyoured.at/wheres-the-money/&quot;&gt;as I've written before&lt;/a&gt;, is effectively the entire generative AI industry, with its nearest competitor being less than five percent of its 500 million weekly active users.
  2112. &lt;/blockquote&gt;
  2113. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUO_otUINB3D6lVN-TpU1X4uBKyAutNvrDAbprDZx2p3zlt5Kopvu4AMsA3JghopXuXAXII5vI8eFDWwc7BHssUh7BmcOm-gs_h6JUaWTg-sojGA9LNRa3KdYMJJAb_L9cxmxsB0rP0QolMuVlRVOoyMZ7jA37J3CqUeO9XczwjFXTsbFUD8V-9d3ehAWy/s1438/OpenAI.png&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;181&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUO_otUINB3D6lVN-TpU1X4uBKyAutNvrDAbprDZx2p3zlt5Kopvu4AMsA3JghopXuXAXII5vI8eFDWwc7BHssUh7BmcOm-gs_h6JUaWTg-sojGA9LNRa3KdYMJJAb_L9cxmxsB0rP0QolMuVlRVOoyMZ7jA37J3CqUeO9XczwjFXTsbFUD8V-9d3ehAWy/w200-h181/OpenAI.png&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Source&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  2114. Ed Zitron has been arguing for more than a year that OpenAI's finances simply don't make sense, and in &lt;a href=&quot;https://www.wheresyoured.at/openai-is-a-systemic-risk-to-the-tech-industry-2/&quot;&gt;&lt;i&gt;OpenAI Is A Systemic Risk To The Tech Industry&lt;/i&gt;&lt;/a&gt; he makes the case in exquisite detail and concludes:&lt;br /&gt;
  2115. &lt;blockquote&gt;
  2116. Even in a hysterical bubble where everybody is agreeing that this is the future, OpenAI currently requires more money and more compute than is reasonable to acquire. &lt;i&gt;Nobody&lt;/i&gt; has ever raised as much as OpenAI needs to, and based on the sheer amount of difficulty that SoftBank is having in raising the funds to meet &lt;i&gt;the lower tranche ($10bn) of its commitment&lt;/i&gt;, it may simply not be possible for this company to continue.&lt;br /&gt;
  2117. &lt;br /&gt;
  2118. Even with &lt;i&gt;extremely&lt;/i&gt; preferential payment terms — months-long deferred payments, for example — at some point somebody is going to need to get paid.&lt;br /&gt;
  2119. &lt;br /&gt;
  2120. I will give Sam Altman credit. He's found many partners to shoulder the burden of the rotten economics of OpenAI, with Microsoft, Oracle, Crusoe and CoreWeave handling the up-front costs of building the infrastructure, SoftBank finding the investors for its monstrous round, and the tech media mostly handling his marketing for him.&lt;br /&gt;
  2121. &lt;br /&gt;
  2122. He is, however, over-leveraged. OpenAI has never been forced to stand on its own two feet or focus on efficiency, and I believe the constant enabling of its ugly, nonsensical burnrate has doomed this company. OpenAI has acted like it’ll always have more money and compute, and that people will always believe its bullshit, mostly because up until recently &lt;i&gt;everybody has&lt;/i&gt;.&lt;br /&gt;
  2123. &lt;br /&gt;
  2124. OpenAI cannot &quot;make things cheaper&quot; at this point, because the money has always been there to make things more expensive, as has the compute to make larger language models that burn billions of dollars a year. This company is not built to reduce its footprint in any way, nor is it built for a future in which it wouldn't have access to, as I've said before, infinite resources.
  2125. &lt;/blockquote&gt;
  2126. Zitron uses Lehman Brothers as an analogy for the effects of a potential &lt;a href=&quot;https://www.wheresyoured.at/openai-is-a-systemic-risk-to-the-tech-industry-2/&quot;&gt;OpenAI failure&lt;/a&gt;:&lt;br /&gt;
  2127. &lt;blockquote&gt;
  2128. I can see OpenAI’s failure having a similar systemic effect. While there is a vast difference between OpenAI’s involvement in people’s lives compared to the millions of subprime loans issued to real people, the stock market’s dependence on the value of the Magnificent 7 stocks (Apple, Microsoft, Amazon, Alphabet, NVIDIA and Tesla), and in turn the Magnificent 7’s reliance on the stability of the AI boom narrative still threatens material harm to millions of people, and that’s before the ensuing layoffs.
  2129. &lt;/blockquote&gt;
  2130. Siddharth Venkataramakrishnan's &lt;a href=&quot;https://www.ft.com/content/24218775-57b1-4e9f-ba64-266a3239cf27&quot;&gt;&lt;i&gt;AI hype is drowning in slopaganda&lt;/i&gt;&lt;/a&gt; suggests that &quot;the AI boom narrative&quot; isn't self-sustaining:&lt;br /&gt;
  2131. &lt;blockquote&gt;
  2132. One hint that we might just be stuck in a hype cycle is the proliferation of what you might call “second-order slop” or “slopaganda”: a tidal wave of newsletters and X threads expressing awe at every press release and product announcement to hoover up some of that sweet, sweet advertising cash.&lt;br /&gt;
  2133. &lt;br /&gt;
  2134. That AI companies are actively patronising and fanning a cottage economy of self-described educators and influencers to bring in new customers suggests the emperor has no clothes (and six fingers).&lt;br /&gt;
  2135. &lt;br /&gt;
  2136. There are an awful lot of AI newsletters out there, but the two which kept appearing in my X ads were Superhuman AI run by Zain Kahn, and Rowan Cheung’s The Rundown. Both claim to have more than a million subscribers — an impressive figure, given the FT as of &lt;a href=&quot;https://pressgazette.co.uk/newsletters/publisher-newsletter-strategies-financial-times/&quot;&gt;February&lt;/a&gt; had 1.6mn subscribers across its &lt;a href=&quot;https://www.ft.com/newsletters&quot;&gt;newsletters&lt;/a&gt;.&lt;br /&gt;
  2137. &lt;br /&gt;
  2138. If you actually read the AI newsletters, it becomes harder to see why anyone’s staying signed up. They offer a simulacrum of tech reporting, with deeper insights or scepticism stripped out and replaced with techno-euphoria. Often they resemble the kind of press release summaries ChatGPT could have written.&lt;br /&gt;
  2139. &lt;br /&gt;
  2140. Yet AI companies apparently see enough upside to put money into these endeavours. In a 2023 interview, Zayn claimed that advertising spots on Superhuman pull in &lt;a href=&quot;https://www.businessinsider.com/make-7-figures-revenue-newsletter-artificial-intelligence-chatgpt-2023-6&quot;&gt;“six figures a month”&lt;/a&gt;. It currently costs $1,899 for a 150-character write-up as a featured tool in the newsletter.&lt;br /&gt;
  2141. ...&lt;br /&gt;
  2142. “These are basically content slop on the internet and adding very little upside on content value,” a data scientist at one of the Magnificent Seven told me. “It’s a new version of the Indian ‘news’ regurgitation portals which have gamified the SEO and SEM [search engine optimisation and marketing] playbook.”&lt;br /&gt;
  2143. &lt;br /&gt;
  2144. But newsletters are only the cream of the crop of slopaganda. X now teems with AI influencers willing to promote AI products for minimal sums (the lowest pricing I got was $40 a retweet). Most appear to be from Bangladesh or India, with a smattering of accounts claiming to be based in Australia or Europe. In apparent contravention of X’s paid partnerships policy, none disclose when they’re getting paid to promote content.&lt;br /&gt;
  2145. ...&lt;br /&gt;
  2146. In its own way, slopaganda exposes that the AI’s emblem is not the &lt;a href=&quot;https://www.nytimes.com/2023/05/30/technology/shoggoth-meme-ai.html&quot;&gt;Shoggoth&lt;/a&gt; but the Ouroboros. It’s a circle of AI firms, VCs backing those firms, talking shops made up of employees of those firms, and the long tail is the hangers-on, content creators, newsletter writers and ‘marketing experts’ willing to say anything for cash.
  2147. &lt;/blockquote&gt;
  2148. The AI bubble bursting would be a whole different and much quicker &quot;going out with a bang&quot;. How likely is it? To some extent OpenAI is just a front for Microsoft, which gets a slice of OpenAI's revenue, has access to OpenAI's technology, &quot;owns&quot; a slice of the &quot;non-profit&quot;, and provides almost all of OpenAI's compute at discounted prices. Microsoft, therefore, has perhaps the best view of the generative AI industry and its prospects.&lt;br /&gt;
  2149. &lt;br /&gt;
  2150. That makes David Gerard's &lt;a href=&quot;https://pivot-to-ai.com/2025/04/07/confirmed-microsoft-stops-new-data-centres-worldwide/&quot;&gt;&lt;i&gt;Confirmed: Microsoft stops new data centres worldwide&lt;/i&gt;&lt;/a&gt; very interesting:&lt;br /&gt;
  2151. &lt;blockquote&gt;
  2152. In February, stock analysts TD Cowen spotted that Microsoft had &lt;a href=&quot;https://pivot-to-ai.com/2025/02/24/microsoft-plays-down-ai-bubble-cancels-data-center-leases/&quot;&gt;cancelled leases for new data centres&lt;/a&gt; — 200 megawatts in the US, and one gigawatt of planned leases around the world.&lt;br /&gt;
  2153. &lt;br /&gt;
  2154. Microsoft denied everything. But TD Cowen kept investigating and found another two gigawatts of cancelled leases in the US and Europe. [&lt;a href=&quot;https://www.bloomberg.com/news/articles/2025-03-26/microsoft-abandons-more-data-center-projects-td-cowen-says&quot;&gt;Bloomberg&lt;/a&gt;, &lt;a href=&quot;https://archive.is/bofuy&quot;&gt;archive&lt;/a&gt;]&lt;br /&gt;
  2155. &lt;br /&gt;
  2156. Bloomberg has now confirmed that Microsoft has halted new data centres in Indonesia, the UK, Australia and the US. [&lt;a href=&quot;https://www.bloomberg.com/news/articles/2025-04-03/microsoft-pulls-back-on-data-centers-from-chicago-to-jakarta&quot;&gt;Bloomberg&lt;/a&gt;, &lt;a href=&quot;https://archive.is/bofuy&quot;&gt;archive&lt;/a&gt;]&lt;br /&gt;
  2157. &lt;br /&gt;
  2158. The Cambridge, UK site was specifically designed to host Nvidia GPU clusters. Microsoft also pulled out of the new Docklands Data Centre in Canary Wharf, London.&lt;br /&gt;
  2159. &lt;br /&gt;
  2160. In Wisconsin, US, Microsoft had already spent $262 million on construction — but then just pulled the plug.&lt;br /&gt;
  2161. &lt;br /&gt;
  2162. &lt;a href=&quot;https://pivot-to-ai.com/2024/07/01/microsoft-you-wouldnt-download-a-car-thats-our-job/&quot;&gt;Mustafa Suleyman&lt;/a&gt; of Microsoft told CNBC that instead of being “the absolute frontier,” Microsoft now prefers AI models that are “three to six months behind.” [&lt;a href=&quot;https://www.cnbc.com/2025/04/04/microsoft-ai-chief-sees-benefits-to-ai-models-that-are-months-behind.html&quot;&gt;CNBC&lt;/a&gt;]&lt;br /&gt;
  2163. &lt;br /&gt;
  2164. Google has taken up some of Microsoft’s abandoned deals in Europe. OpenAI took over Microsoft’s contract with CoreWeave. [&lt;a href=&quot;https://www.reuters.com/technology/artificial-intelligence/coreweave-strikes-12-billion-contract-with-openai-ahead-ipo-sources-say-2025-03-10/&quot;&gt;Reuters&lt;/a&gt;]
  2165. &lt;/blockquote&gt;
  2166. Ed Zitron covered this &quot;pullback&quot; a month ago in &lt;a href=&quot;https://www.wheresyoured.at/power-cut/&quot;&gt;&lt;i&gt;Power Cut&lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2167. &lt;blockquote&gt;
  2168. As a result, based on TD Cowen's analysis, Microsoft has, through a combination of canceled leases, pullbacks on Statements of Qualifications, cancellations of land parcels and deliberate expiration of Letters of Intent, effectively abandoned data center expansion equivalent to over 14% of its current capacity.&lt;br /&gt;
  2169. ...&lt;br /&gt;
  2170. In plain English, Microsoft, which arguably has more data than anybody else about the health of the generative AI industry and its potential for growth, has decided that it needs to dramatically slow down its expansion. Expansion which, to hammer the point home, is absolutely necessary for generative AI to continue evolving and expanding.
  2171. &lt;/blockquote&gt;
  2172. Microsoft may have got the message, but &lt;a href=&quot;https://www.wheresyoured.at/power-cut/&quot;&gt;T.D. Cowen reported&lt;/a&gt; others haven't:&lt;br /&gt;
  2173. &lt;blockquote&gt;
  2174. While there is a pullback in Microsoft's data center leasing, it’s seen a &quot;commensurate rise in demand from Oracle related to &lt;a href=&quot;https://openai.com/index/announcing-the-stargate-project/&quot;&gt;The Stargate Project&lt;/a&gt;&quot; — a relatively new partnership of &quot;up to $500 billion&quot; to build massive new data centers for AI, led by SoftBank, and OpenAI, with investment from Oracle and MGX, a &lt;a href=&quot;https://www.theinformation.com/articles/openai-in-talks-with-emirati-fund-mgx-for-7-billion-fundraise&quot;&gt;$100 billion investment fund backed by the United Arab Emirates&lt;/a&gt;.
  2175. &lt;/blockquote&gt;
  2176. The data centers will get built, and they will consume power, because even if AI never manages to turn a profit, some other bubble will take its place to use all the idle GPUs. Despite all the rhetoric about renewables and small modular reactors, much of the additional power will come from fossil fuels.&lt;br /&gt;
  2177. &lt;br /&gt;
  2178. Data center carbon emissions don't just come from power (Scope 1 and 2). In 2023 Jialin Lyu &lt;i&gt;et al&lt;/i&gt; from Microsoft published &lt;a href=&quot;https://doi.org/10.1145/3604930.3605717&quot;&gt;&lt;i&gt;Myths and Misconceptions Around Reducing Carbon Embedded in Cloud Platforms&lt;/i&gt;&lt;/a&gt; and stressed the importance of embedded carbon (Scope 3) in the total environmental impact of data centers:&lt;br /&gt;
  2179. &lt;blockquote&gt;
  2180. For example, 66% of the electricity used at Google datacenters was matched with renewable energy on an hourly basis in 2021. With historic growth rates, this is likely closer to 70% today. Our LCAs indicate that with 70-75% renewable energy, Scope 3 accounts for close to half of datacenter carbon emissions. Therefore, Scope 3 emissions and embodied carbon are important factors both currently and in the near future.
  2181. &lt;/blockquote&gt;
  2182. Microsoft is commendably frank about Scope 3 emissions. A year ago Dan Robinson reported that &lt;a href=&quot;https://www.theregister.com/2024/05/16/microsoft_co2_emissions/&quot;&gt;&lt;i&gt;
  2183. Microsoft's carbon emissions up nearly 30% thanks to AI
  2184. &lt;/i&gt;&lt;/a&gt;:&lt;br /&gt;
  2185. &lt;blockquote&gt;
  2186. The Redmond IT giant says that its CO2 emissions are up 29.1 percent from the 2020 baseline, and this is largely due to indirect emissions (Scope 3) from the construction and provisioning of more datacenters to meet customer demand for cloud services.&lt;br /&gt;
  2187. &lt;br /&gt;
  2188. These figures come from Microsoft's &lt;a href=&quot;https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RW1lhhu&quot;&gt;2024 Environmental Sustainability Report&lt;/a&gt; [PDF], which covers the corp's FY2023 ended June 30, 2023. This encompasses a period when Microsoft started ramping up AI support following the explosion of interest in OpenAI and ChatGPT.
  2189. &lt;/blockquote&gt;
  2190. Microsoft's &quot;pullback&quot; will likely reduce their Scope 3 emissions going forward, but I would expect that their recent build-out will have reduced the proportion of renewables being consumed. If the Stargate build-out goes ahead it will cause enormous Scope 3 emissions.&lt;br /&gt;
  2191. &lt;br /&gt;
  2192. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpoVBKUX2lh30QlhSZmLZGrXgq7DdD9MU3UGP_k5t0duHtPOwEDNrxw4VH3f5UUIlH0tar_V6qmfHZyjXQNKdao-hEBskpkE9ttftb9Ux_gZNjRWXFDpwE9FbHRQ-1RQh3sBHaHvPVWmhBV2DNxhB5wXNPHLlNGP2d7bP3kktwRxBK16wP7Q55slf7JD__/s1748/SeagateEmbodiedCarbon.png&quot; style=&quot;clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;124&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpoVBKUX2lh30QlhSZmLZGrXgq7DdD9MU3UGP_k5t0duHtPOwEDNrxw4VH3f5UUIlH0tar_V6qmfHZyjXQNKdao-hEBskpkE9ttftb9Ux_gZNjRWXFDpwE9FbHRQ-1RQh3sBHaHvPVWmhBV2DNxhB5wXNPHLlNGP2d7bP3kktwRxBK16wP7Q55slf7JD__/w200-h124/SeagateEmbodiedCarbon.png&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://www.seagate.com/content/dam/seagate/assets/resources/decarbonizing-data-report/decarbonizing-data-report-040325.pdf&quot;&gt;Source&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  2193. Here is one final note of gloom. Training AI models requires rapid access to large amounts of data, motivating data centers to use SSDs instead of hard drives. Counterintuitively, research Seagate published in &lt;a href=&quot;https://www.seagate.com/content/dam/seagate/assets/resources/decarbonizing-data-report/decarbonizing-data-report-040325.pdf&quot;&gt;&lt;i&gt;Decarbonizing Data&lt;/i&gt;&lt;/a&gt; shows that, despite their smaller size, SSDs have much higher embedded carbon emissions than hard drives. 30TB of SSD has over 160 times as much embedded carbon as a 30TB hard drive.&lt;br /&gt;
  2194. &lt;br /&gt;
  2195. &lt;table cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;float: right; margin-left: 1em; text-align: right;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmmh5eOzcEsocAy9yLgzeNgNRaRWWEH-lD2ZHj7ROA381HlEhfjRI0izGtyC0rjKThWyoOQzc9gqYGCr7lsMhzV3-YTzfCxblq8Jwf7Por-ygQeUG3ukMlS8LXKgEQ-PJa3llwKjP2KTTk1U89rAWZofrmNFBFAGEA4Kpx6Go1XIPKoOK8zt7O1ZFGHluj/s1760/SeagatePower.png&quot; style=&quot;clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;&quot;&gt;&lt;img border=&quot;0&quot; height=&quot;101&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmmh5eOzcEsocAy9yLgzeNgNRaRWWEH-lD2ZHj7ROA381HlEhfjRI0izGtyC0rjKThWyoOQzc9gqYGCr7lsMhzV3-YTzfCxblq8Jwf7Por-ygQeUG3ukMlS8LXKgEQ-PJa3llwKjP2KTTk1U89rAWZofrmNFBFAGEA4Kpx6Go1XIPKoOK8zt7O1ZFGHluj/w200-h101/SeagatePower.png&quot; width=&quot;200&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://www.seagate.com/content/dam/seagate/assets/resources/decarbonizing-data-report/decarbonizing-data-report-040325.pdf&quot;&gt;Source&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
  2196. Even more of a suprise, Seagate's research shows that SSDs even have higher operational emissions than hard drives. While actively reading or writing data, a 30TB SSD uses twice as much power as a 30TB hard drive.&lt;br /&gt;
  2197. &lt;br /&gt;
  2198. The legacy of Balthus' clients attitude that they 'know the world is doomed, so may as well go out with a bang' and the unsustainable AI bubble will be a massive overbuild of data centers, most of which will be incapable of hosting Nvidia's top-of-the-line racks. If the current cryptocurrency-friendly administration succeeds in pumping Bitcoin back these data centers will likely revert to mining. Either way, the Scope 3 emissions from building and equipping and the Scope 1 and 2 emissions from powering them with natural gas and coal, will put megatons of CO2 into the atmosphere, hastening the point where it is unlikely that 'us in the North, we're fine'.&lt;br /&gt;
  2199. &lt;br /&gt;
  2200. &lt;pre&gt;&lt;/pre&gt;</content>
  2201.      <author>
  2202.  <name>David. (noreply@blogger.com)</name>
  2203. <uri>https://blog.dshr.org/</uri>
  2204.      </author>
  2205.    </entry>
  2206.  
  2207.    <entry>
  2208.      <title>Tara Robertson: A deck of powerful questions</title>
  2209.      <link rel="alternate" type="text/html" href="https://tararobertson.ca/2025/a-deck-of-powerful-questions/"/>
  2210.      <id>https://tararobertson.ca/?p=6957</id>
  2211.      <updated>2025-06-16T14:06:41+00:00</updated>
  2212.      <content type="html">&lt;p&gt;&lt;a href=&quot;https://tararobertson.ca/wp-content/uploads/2025/06/IMG_6596-scaled.jpg&quot;&gt;&lt;img alt=&quot;me holding a card that says &amp;quot;what do you have control over?&amp;quot;&quot; class=&quot;aligncenter wp-image-6958 size-large&quot; height=&quot;1024&quot; src=&quot;https://tararobertson.ca/wp-content/uploads/2025/06/IMG_6596-768x1024.jpg&quot; width=&quot;768&quot; /&gt;&lt;/a&gt;&lt;/p&gt;
  2213. &lt;p&gt;One of my work highlights this year has been co-designing a series of five workshops to teach coaching skills to open scientists. Open scientists from NASA, NOAA Fisheries, The California Environmental Protection Agency, Fred Hutch Cancer Center participated in these workshops organized by &lt;a href=&quot;https://openscapes.org/&quot;&gt;Openscapes&lt;/a&gt;. &lt;/p&gt;
  2214. &lt;p&gt;For the knowledge transfer part of the project, we made &lt;a href=&quot;https://github.com/Openscapes/powerful-questions-card-deck&quot;&gt;a card deck of powerful questions for open scientists&lt;/a&gt;. The artwork on the back is from &lt;a class=&quot;ember-view&quot; href=&quot;https://allisonhorst.com/&quot; id=&quot;ember952&quot;&gt;Allison Horst&lt;/a&gt;. These are open licensed. &lt;/p&gt;
  2215. &lt;p&gt;Thank you to &lt;a href=&quot;https://www.tinamariemeyer.com/&quot;&gt;Tina-Marie Meyer&lt;/a&gt; and &lt;a href=&quot;https://www.alvinpilobello.com/&quot;&gt;Alvin Pilobello&lt;/a&gt; for your partnership and thank you &lt;a href=&quot;https://jules32.github.io/&quot;&gt;Dr. Julia Stewart Lowndes&lt;/a&gt; for hiring us to do this important work.&lt;/p&gt;
  2216. &lt;p&gt;The post &lt;a href=&quot;https://tararobertson.ca/2025/a-deck-of-powerful-questions/&quot;&gt;A deck of powerful questions&lt;/a&gt; appeared first on &lt;a href=&quot;https://tararobertson.ca&quot;&gt;Tara Robertson Consulting&lt;/a&gt;.&lt;/p&gt;</content>
  2217.      <author>
  2218.  <name>Tara Robertson</name>
  2219. <uri>https://tararobertson.ca/</uri>
  2220.      </author>
  2221.    </entry>
  2222.  
  2223.    <entry>
  2224.      <title>Journal of Web Librarianship: Inclusive cataloging: Histories, context, and reparative approaches</title>
  2225.      <link rel="alternate" type="text/html" href="https://www.tandfonline.com/doi/full/10.1080/19322909.2025.2515358?ai=1dl&amp;mi=co84bk&amp;af=R"/>
  2226.      <id>/doi/full/10.1080/19322909.2025.2515358?ai=1dl&amp;mi=co84bk&amp;af=R</id>
  2227.      <updated>2025-06-16T10:23:00+00:00</updated>
  2228.      <content type="html">. &lt;br /&gt;</content>
  2229.      <author>
  2230.  <name>Ellen Corrigan Eastern Illinois University, Charleston, IL, USA</name>
  2231. <uri>https://www.tandfonline.com/loi/wjwl20?ai=1dl&mi=co84bk&af=R</uri>
  2232.      </author>
  2233.    </entry>
  2234.  
  2235.    <entry>
  2236.      <title>Information Technology and Libraries: Letter from the Editors</title>
  2237.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17472"/>
  2238.      <id>https://ital.corejournals.org/index.php/ital/article/view/17472</id>
  2239.      <updated>2025-06-16T07:00:00+00:00</updated>
  2240.      <content type="html"></content>
  2241.      <author>
  2242.  <name>Kenneth J. Varnum; Marisha C. Kelly</name>
  2243. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2244.      </author>
  2245.    </entry>
  2246.  
  2247.    <entry>
  2248.      <title>Information Technology and Libraries: Learning, Listening, and Leading</title>
  2249.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17455"/>
  2250.      <id>https://ital.corejournals.org/index.php/ital/article/view/17455</id>
  2251.      <updated>2025-06-16T07:00:00+00:00</updated>
  2252.      <content type="html">&lt;p&gt;Starting a systems librarian role after a prolonged vacancy presents a distinct set of challenges and opportunities. This reflective column explores the experience of navigating the first year in such a position, emphasizing the importance of learning through action, relationship-building, and adaptive leadership. From mastering core systems to establishing connections across campus IT and internal library departments, the column highlights strategies for building credibility and trust while supporting critical library services. It also examines the transition from technical contributor to departmental leader, illustrating how systems librarians increasingly engage in strategic planning, policy development, and organizational change. This column offers practical insights for new systems librarians stepping into evolving roles at the intersection of technology, user experience, and institutional priorities.&lt;/p&gt;</content>
  2253.      <author>
  2254.  <name>Amber Wu</name>
  2255. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2256.      </author>
  2257.    </entry>
  2258.  
  2259.    <entry>
  2260.      <title>Information Technology and Libraries: Starting up a Digital Preservation (Pilot) Program</title>
  2261.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17452"/>
  2262.      <id>https://ital.corejournals.org/index.php/ital/article/view/17452</id>
  2263.      <updated>2025-06-16T07:00:00+00:00</updated>
  2264.      <content type="html">&lt;p&gt;In 2024, Hamilton College started a project to pilot Archivematica, a digital preservation platform. The author details leading the project as the relatively new Digital Curation and Preservation Librarian and suggests ways that readers could approach such a project, incorporating what she learned along the way.&lt;/p&gt;</content>
  2265.      <author>
  2266.  <name>Kimberly Hoffman</name>
  2267. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2268.      </author>
  2269.    </entry>
  2270.  
  2271.    <entry>
  2272.      <title>Information Technology and Libraries: Ready Year One</title>
  2273.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17424"/>
  2274.      <id>https://ital.corejournals.org/index.php/ital/article/view/17424</id>
  2275.      <updated>2025-06-16T07:00:00+00:00</updated>
  2276.      <content type="html">&lt;p&gt;This reflective piece chronicles the first year of establishing a Virtual Reality (VR) program at the Paul Sawyier Public Library in Frankfort, Kentucky. Beginning with limited VR experience and a modest budget, the author navigated the challenges of launching the initiative—from equipment procurement and account management to program development and community engagement. The library introduced VR experiences aimed at enhancing digital literacy, education, and recreation for diverse age groups, achieving notable success with programs such as &lt;em&gt;Anne Frank House VR&lt;/em&gt; and &lt;em&gt;Maestro VR&lt;/em&gt;. Lessons learned included the importance of thorough preparation, selecting age-appropriate content, embracing community feedback, and adapting programming to audience needs. Despite early obstacles, the VR initiative attracted over 50 participants in its first year, demonstrating the community’s readiness for emerging technologies. Building on this momentum, the library plans to expand its VR offerings, cementing virtual reality as an innovative and inclusive tool for lifelong learning and engagement in a rural setting.&lt;/p&gt;</content>
  2277.      <author>
  2278.  <name>Dalton Bennett</name>
  2279. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2280.      </author>
  2281.    </entry>
  2282.  
  2283.    <entry>
  2284.      <title>Information Technology and Libraries: Prospects of Retrieval Augmented Generation (RAG) for Academic Library Search and Retrieval</title>
  2285.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17361"/>
  2286.      <id>https://ital.corejournals.org/index.php/ital/article/view/17361</id>
  2287.      <updated>2025-06-16T07:00:00+00:00</updated>
  2288.      <content type="html">&lt;div&gt; &lt;p class=&quot;AbstractText&quot;&gt;This paper examines the integration of retrieval-augmented generation (RAG) systems within academic library environments, focusing on their potential to transform traditional search and retrieval mechanisms. RAG combines the natural language understanding capabilities of large language models with structured retrieval from verified knowledge bases, offering a novel approach to academic information discovery. The study analyzes the technical requirements for implementing RAG in library systems, including embedding pipelines, vector databases, and middleware architecture for integration with existing library infrastructure. We explore how RAG systems can enhance search precision through semantic indexing, real-time query processing, and contextual understanding while maintaining compliance with data privacy and copyright regulations. The research highlights RAG’s ability to improve user experience through personalized research assistance, conversational interfaces, and multimodal content integration. Critical considerations including ethical implications, copyright compliance, and system transparency are addressed. Our findings indicate that while RAG presents significant opportunities for advancing academic library services, successful implementation requires careful attention to technical architecture, data protection, and user trust. The study concludes that RAG integration holds promise for revolutionizing academic library services while emphasizing the need for continued research in areas of scalability, ethical compliance, and cost-effective implementation.&lt;/p&gt; &lt;/div&gt;</content>
  2289.      <author>
  2290.  <name>Ravi Varma Kumar Bevara, Brady D. Lund, Nishith Reddy Mannuru, Sai Pranathi Karedla, Yara Mohammed, Sai Tulasi Kolapudi, Aashrith Mannuru</name>
  2291. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2292.      </author>
  2293.    </entry>
  2294.  
  2295.    <entry>
  2296.      <title>Information Technology and Libraries: Web Archives Metadata Generation with GPT-4o</title>
  2297.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17305"/>
  2298.      <id>https://ital.corejournals.org/index.php/ital/article/view/17305</id>
  2299.      <updated>2025-06-16T07:00:00+00:00</updated>
  2300.      <content type="html">&lt;div&gt; &lt;p class=&quot;AbstractText&quot;&gt;Current metadata creation for web archives is time consuming and costly due to reliance on human effort. This paper explores the use of GPT-4o for metadata generation within the Web Archive Singapore, focusing on scalability, efficiency, and cost effectiveness. We processed 112 Web ARChive (WARC) files using data reduction techniques, achieving a notable 99.9% reduction in metadata generation costs. By prompt engineering, we generated titles and abstracts, which were evaluated both intrinsically using Levenshtein distance and BERTScore, and extrinsically with human cataloguers using McNemar’s test. Results indicate that while our method offers significant cost savings and efficiency gains, human curated metadata maintains an edge in quality. The study identifies key challenges including content inaccuracies, hallucinations, and translation issues, suggesting that large language models (LLMs) should serve as complements rather than replacements for human cataloguers. Future work will focus on refining prompts, improving content filtering, and addressing privacy concerns through experimentation with smaller models. This research advances the integration of LLMs in web archiving, offering valuable insights into their current capabilities and outlining directions for future enhancements. The code is available at &lt;a href=&quot;https://github.com/masamune-prog/warc2summary&quot;&gt;https://github.com/masamune-prog/warc2summary&lt;/a&gt; for further development and use by institutions facing similar challenges.&lt;/p&gt; &lt;/div&gt;</content>
  2301.      <author>
  2302.  <name>Ashwin Nair, Zhen Rong Goh, Tianrui Liu, Abigail Yongping Huang</name>
  2303. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2304.      </author>
  2305.    </entry>
  2306.  
  2307.    <entry>
  2308.      <title>Information Technology and Libraries: Weaving the Threads of Bibliographic Ontologies</title>
  2309.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17289"/>
  2310.      <id>https://ital.corejournals.org/index.php/ital/article/view/17289</id>
  2311.      <updated>2025-06-16T07:00:00+00:00</updated>
  2312.      <content type="html">&lt;div&gt; &lt;p class=&quot;AbstractText&quot;&gt;&lt;span lang=&quot;EN-GB&quot;&gt;Bibliographic ontologies are crucial to make the most of networked library metadata, but they show interoperability limitations in the Semantic Web. Following a research study on the subject, this paper presents a possible solution to such limitations by means of a reference ontology (RO) intended to allow integration of different ontologies without imposing a common central one and to overcome limitations of mapping techniques, such as crosswalks and application profiles, most used in interconnecting bibliographic ontologies. Interoperability issues of Resource Description and Access (RDA) and Bibliographic Framework Initiative—BIBFRAME (BF) ontologies are addressed using real-world examples from the Library of Congress (LoC) and Biblioteca Nacional de España (BNE) datasets. For a proof of concept of the RO, this paper is focused on two specific interoperability problems that are not solvable with the usual data transformative techniques: &lt;a name=&quot;_Hlk169004178&quot;&gt;&lt;/a&gt;misalignments concerning the definition and representation of Work and Expression classes; and the absence of formalization of properties essential to whole-part relationships, namely transitivity, nonreflexivity and asymmetry. The potential of the RO for solving such problem examples is demonstrated by making in-depth use of Resource Description Framework Schema/Web Ontology Language (RDFS/OWL) semantic reasoning and inference mechanisms, combined with Shapes Constraint Language (SHACL), when restrictions are needed to impose data constraints and validation. The RO innovation consists in the formulation of an independent high-level ontology, through which the elements of different source-ontologies are interlinked without being modified or replaced, but rather preserved, and in using semantic mechanisms to generate additional elements needed to consistently describe the relationship between them.&lt;/span&gt;&lt;/p&gt; &lt;/div&gt;</content>
  2313.      <author>
  2314.  <name>Helena Patrício, Pedro Nogueira Ramos, Maria Inês Cordeiro</name>
  2315. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2316.      </author>
  2317.    </entry>
  2318.  
  2319.    <entry>
  2320.      <title>Information Technology and Libraries: Irrelevant Discovery Layers?</title>
  2321.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17266"/>
  2322.      <id>https://ital.corejournals.org/index.php/ital/article/view/17266</id>
  2323.      <updated>2025-06-16T07:00:00+00:00</updated>
  2324.      <content type="html">&lt;div&gt; &lt;p class=&quot;AbstractText&quot;&gt;Over the past fifteen years, most academic librarians have implemented one-stop search tools, commonly referred to as discovery layers, to accommodate contemporary user expectations. In more recent years these tools have come under criticism due to their limitations and shortcomings. We set out to evaluate if a discovery layer, when prompted with typical user keyword search strings, produced the most relevant search results when compared with two other widely accessible academic search tools. We compared search results from a discovery layer with a central index (WorldCat Discovery) with search results from a subscription interdisciplinary index and abstract database (Academic Search Complete) and a freely accessible academic web search engine (Google Scholar). We created a rubric detailed enough for multiple evaluators, who were the authors, to judge search results for currency, relevancy, proximity, and authority, as well as to assign appropriate penalties. Academic Search Complete search results received the highest overall scores, while WorldCat Discovery search results received the lowest overall scores. When considering individual pieces of the rubric, Academic Search Complete provided the most current and authoritative sources, while Google Scholar provided the most relevant sources. This article provides recommendations for libraries moving forward to consider the benefits and costs of discovery tools.&lt;/p&gt; &lt;/div&gt;</content>
  2325.      <author>
  2326.  <name>Ruth Szpunar, Eric Bradley, Erin Gabrielson, Catherine Pellegrino</name>
  2327. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2328.      </author>
  2329.    </entry>
  2330.  
  2331.    <entry>
  2332.      <title>Information Technology and Libraries: Simplifying and Enhancing Access to Full-Text Articles with LibKey Discovery</title>
  2333.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/17264"/>
  2334.      <id>https://ital.corejournals.org/index.php/ital/article/view/17264</id>
  2335.      <updated>2025-06-16T07:00:00+00:00</updated>
  2336.      <content type="html">&lt;div&gt; &lt;p class=&quot;AbstractText&quot;&gt;Content delivery mechanisms in web-scale discovery services can often fall short of patron expectations. The Texas Tech University Libraries integrated Third Iron’s LibKey Discovery with Ex Libris Primo in response to patron feedback that accessing full-text articles from Primo required too many clicks. Following implementation, researchers conducted user testing to assess the usability and usefulness of the integration. They also analyzed LibKey Discovery and link resolver usage data across the year following the launch to evaluate its impact on how patrons accessed full-text articles. Patrons responded positively overall to LibKey Discovery, perceiving it as more efficient and quicker to use. Statistics showed an increase in LibKey link usage and a corresponding decrease in link resolver clickthroughs. This article provides an overview of the implementation process, describes assessment methodologies and findings, and discusses implications for improving usability and increasing usage.&lt;/p&gt; &lt;/div&gt;</content>
  2337.      <author>
  2338.  <name>Jessie London, Barbara McArthur, Kimberly Vardeman, Cynthia L. Henry</name>
  2339. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2340.      </author>
  2341.    </entry>
  2342.  
  2343.    <entry>
  2344.      <title>Information Technology and Libraries: Leveraging Innovative Technologies for Improved Library Practices in the Digital Era</title>
  2345.      <link rel="alternate" type="text/html" href="https://ital.corejournals.org/index.php/ital/article/view/16767"/>
  2346.      <id>https://ital.corejournals.org/index.php/ital/article/view/16767</id>
  2347.      <updated>2025-06-16T07:00:00+00:00</updated>
  2348.      <content type="html">&lt;div&gt; &lt;p class=&quot;AbstractText&quot;&gt;Lack of expertise, among other reasons, has been cited as the reason for Nigerian libraries’ sluggish adoption and application of innovative technology. To solve this problem, a thorough analysis of the level of technology implementation in Nigerian libraries is needed. This work closes the gap in the literature by using the PRISMA search method to examine 42 pertinent works published between 2016 and 2023. A questionnaire was used to collect data from 240 respondents using the survey approach. Six federal university libraries were selected by a stratified random sampling procedure, one from each of Nigeria’s six geopolitical zones. Then, 40 library employees from each of the selected libraries were selected using a purposive sample technique. Mean and standard deviation were used to analyze the responses. The results demonstrate a grand mean of 2.44 (SD=1.15), indicating low staff awareness of the potential benefits of innovative technology for library operations. A negative grand mean of 2.42 on the degree of usage was also noted, indicating low use of cutting-edge technology in the research area. A lack of internet penetration, a deficiency of digital literacy, insufficient funds, and unstable power sources were among the problems found limiting Nigerian libraries from taking advantage of advanced technologies. In light of this, the study suggests that the Nigerian government should increase budget allocations for libraries and enhance policies that support the self-development of library staff. Additionally, libraries in developing countries should seek mentorship from those in developed nations to update their knowledge and skills, enabling better implementation and deployment of cutting-edge technology in their libraries.&lt;/p&gt; &lt;/div&gt;</content>
  2349.      <author>
  2350.  <name>Rebecca Chidimma Ojobor, Prof. Victoria N. Okafor, Dr. Richard N Ugwuanyi</name>
  2351. <uri>https://ital.corejournals.org/index.php/ital</uri>
  2352.      </author>
  2353.    </entry>
  2354.  
  2355.    <entry>
  2356.      <title>Distant Reader Blog: Network Graphs Are Fun</title>
  2357.      <link rel="alternate" type="text/html" href="https://distantreader.org/blog/network-graphs-are-fun/"/>
  2358.      <id>https://distantreader.org/blog/network-graphs-are-fun/</id>
  2359.      <updated>2025-06-16T04:00:00+00:00</updated>
  2360.      <content type="html">&lt;div&gt;
  2361.  &lt;p&gt;I frequently create network graphs for the purposes of analyzing and visualizing corpora of narrative text. Network graphs, at their simplest, are composed of only two things, nodes and edges, where nodes are things and edges are relationships between those things. This missive outlines a few ways I use network graphs to do distant reading against corpora.&lt;/p&gt;
  2362.  &lt;h2&gt;Modeling bigrams&lt;/h2&gt;
  2363.  &lt;p&gt;In my world, creating network graphs can be as simple as selectively modeling a corpus's bigrams. For example, in the Iliad and the Odyssey, I might want to know, &quot;What are some of the things shared between Ulysses, Achilles, Hector, Penelope, and Telemachus? What do they all have on common, how are they distinctive, and to what degree are they related to each other?&quot; To actually create the graph, one can find all bigrams, loop through each, retain only the bigrams whose first word is one of the selected characters, and save the result as a file. In my case, an abbreviated version of node combinations -- the graph -- looks like this:&lt;/p&gt;
  2364.  &lt;ul&gt;
  2365.    &lt;li&gt;hector drew&lt;/li&gt;
  2366.    &lt;li&gt;achilles cut&lt;/li&gt;
  2367.    &lt;li&gt;achilles prays&lt;/li&gt;
  2368.    &lt;li&gt;hector tore&lt;/li&gt;
  2369.    &lt;li&gt;telemachus gave&lt;/li&gt;
  2370.    &lt;li&gt;penelope he&lt;/li&gt;
  2371.    &lt;li&gt;ulysses leading&lt;/li&gt;
  2372.    &lt;li&gt;hector saw&lt;/li&gt;
  2373.    &lt;li&gt;ulysses requite&lt;/li&gt;
  2374.    &lt;li&gt;achilles cleaver&lt;/li&gt;
  2375.    &lt;li&gt;ulysses replied&lt;/li&gt;
  2376.    &lt;li&gt;hector horses&lt;/li&gt;
  2377.  &lt;/ul&gt;
  2378.  &lt;p&gt;Such is called an edges table, and while the type of edges is not explicitly articulated, the simple pairing of nodes denotes some sort of relationship.&lt;/p&gt;
  2379.  &lt;p&gt;Once an edges table is created, it can be imported into any number of desktop applications or programming languages for analysis and visualization. Personally, I use an application called &lt;a href=&quot;https://gephi.org&quot;&gt;Gephi&lt;/a&gt; and/or a Python library called &lt;a href=&quot;https://networkx.org&quot;&gt;networkx&lt;/a&gt; for these purposes.&lt;/p&gt;
  2380.  &lt;p&gt;Network graphs have all sorts of different properties. For example, a graph can be directional or not, meaning if the the edges emanate from one node to another, then the graph is &quot;directional&quot;. If not, then the graph is &quot;bi-directional&quot;. Second, a node may have many edges, and the number of edges is called the node's &quot;degree&quot;. If the graph is directional, then nodes have some number of &quot;in-degrees&quot; and some number of &quot;out-degrees&quot;. Other properties include but are certainly not limited to: &quot;betweenness&quot;, &quot;closeness&quot;, and &quot;diameter&quot;. Moreover, one might apply any number of clustering techniques to a graph to determine the size and scope of &quot;neighborhoods&quot;. After taking a graph's properties into account, graphs can be visualized. For example, the color and sizes of things can be assigned, and different layouts can be employed. When it comes to layouts, force-directed layouts are the most popular.&lt;/p&gt;
  2381.  &lt;p&gt;After importing my graph into Gephi, taking a number of different properties into account, I generated the following visualization:&lt;/p&gt;
  2382.  &lt;p style=&quot;text-align: center;&quot;&gt;
  2383.    &lt;img alt=&quot;./bigrams-small.png&quot; src=&quot;https://distantreader.org/blog/network-graphs-are-fun/bigrams-small.png&quot; /&gt;
  2384.    &lt;br /&gt;
  2385.    &lt;a href=&quot;https://distantreader.org/blog/network-graphs-are-fun/bigrams.png&quot;&gt;Network graph of selected characters from the Iliad and the Odyssey&lt;/a&gt;
  2386.  &lt;/p&gt;
  2387.  &lt;p&gt;I can now address my questions:&lt;/p&gt;
  2388.  &lt;ul&gt;
  2389.    &lt;li&gt;What are some of the things shared between Ulysses, Achilles, Hector, Penelope, and Telemachus? Well, for example, both Achilles and Hector are associated with &quot;slay&quot;, &quot;slew&quot;, &quot;shield&quot;, and &quot;burn&quot;. While only Penelope and Telemachus are associated with &quot;wept&quot;, and &quot;sends&quot;.&lt;/li&gt;
  2390.    &lt;li&gt;What do they all have on common? All the characters are associated with &quot;he&quot;, &quot;took&quot;, &quot;smiled&quot;, &quot;lay&quot;, and to the greatest degree, &quot;answered&quot;. (Apparently Ulysses does a lot of answering!)&lt;/li&gt;
  2391.    &lt;li&gt;How are they distinctive? Only Penelope is associated with &quot;daughter&quot;, only Telemachus is associated with &quot;lost&quot;, etc.&lt;/li&gt;
  2392.    &lt;li&gt;To what degree are they related to each other? Because of the layout, I assert Achilles and Telemachus are not strongly associated with each other, at least not as much as Ulysses and Penelope. Furthermore, notice how Telemachus is not opposite his parents, Ulysses and Penelope.&lt;/li&gt;
  2393.  &lt;/ul&gt;
  2394.  &lt;p&gt;The modeling and visualizing of texts in the manner outlined above does not convey truth. Instead, what is conveyed are measurements and observations. It it up to the student, researcher, or scholar to interpret the measurements and observations. Personally, I see violent words between Achilles and Hector. I see words of authority between Ulysses and Achilles. I see non-violent words between Penelope, Telemachus, and Ulysses. All of these observations can then lead to discussion, which is the whole point of reading, in my opinion.&lt;/p&gt;
  2395.  &lt;h2&gt;Modeling bibliographics&lt;/h2&gt;
  2396.  &lt;p&gt;A more concrete way to model texts as network graphs is through bibliographics. Authors write things, and the things have titles. The titles are about any number of topics, and those topics can be denoted as keywords. Given these types of nodes all sort of new questions can be addressed:&lt;/p&gt;
  2397.  &lt;ul&gt;
  2398.    &lt;li&gt;Who wrote what, and what did they write about?&lt;/li&gt;
  2399.    &lt;li&gt;Who is the most prolific author?&lt;/li&gt;
  2400.    &lt;li&gt;What are the items in the corpus about?&lt;/li&gt;
  2401.    &lt;li&gt;Across the corpus, what themes predominate?&lt;/li&gt;
  2402.    &lt;li&gt;To what degree do the items in the corpus cluster into groups, and if they do, then what are those groups?&lt;/li&gt;
  2403.  &lt;/ul&gt;
  2404.  &lt;p&gt;In the case of the Iliad and the Odyssey, there are forty-eight books (&quot;chapters&quot;). They are all written by the same person -- Homer. Using a variation of the venerable Term-Frequency Inverse Document Frequency (TFIDF) algorithm, I can compute and associate statistically significant keywords for each chapter. In this case, I save the node combinations as a Graph Modeling Language (GML) file, which just a more expressive version of an edges table. After importing the file into Gephi, and after taking various properties into account, I can visualize the graph in the following way:&lt;/p&gt;
  2405.  &lt;p style=&quot;text-align: center;&quot;&gt;
  2406.    &lt;img alt=&quot;./two-stories-small.png&quot; src=&quot;https://distantreader.org/blog/network-graphs-are-fun/two-stories-small.png&quot; /&gt;
  2407.    &lt;br /&gt;
  2408.    &lt;a href=&quot;https://distantreader.org/blog/network-graphs-are-fun/two-stories.png&quot;&gt;Bibliographic network of the Iliad and the Odyssey&lt;/a&gt;
  2409.  &lt;/p&gt;
  2410.  &lt;p&gt;From the visualization I can assert a number of things:&lt;/p&gt;
  2411.  &lt;ol&gt;
  2412.    &lt;li&gt;The corpus is about &quot;trojans&quot;, &quot;achaeans&quot;, &quot;ships&quot;, &quot;hector&quot;, &quot;jove&quot;, &quot;suitors&quot;, &quot;father&quot;, &quot;home&quot;, etc.&lt;/li&gt;
  2413.    &lt;li&gt;After applying a clustering technique, I see the corpus falls into two distinct groups. One of the groups is the Iliad and the other is the Odyssey. And no, these groups were not generated by the names of titles but instead by their shared keywords.&lt;/li&gt;
  2414.    &lt;li&gt;Since the word &quot;man&quot; is pretty much in the center of the graph, I assert that &quot;man&quot; is common theme throughout the corpus.&lt;/li&gt;
  2415.  &lt;/ol&gt;
  2416.  &lt;p&gt;As outlined above &quot;betweenness&quot; is a property of graphs. Nodes are denoted as having a high degree of betweenness if one has to go through them to get to other nodes. Such nodes have a certain significance. Kinda like a social network, if there a person who connects many other people, then that person is a good person to know. After calculating betweenness and removing all of the nodes with a betweenness value of zero, the following nodes remain:&lt;/p&gt;
  2417.  &lt;p style=&quot;text-align: center;&quot;&gt;
  2418.    &lt;img alt=&quot;./betweenesses-small.png&quot; src=&quot;https://distantreader.org/blog/network-graphs-are-fun/betweenesses-small.png&quot; /&gt;
  2419.    &lt;br /&gt;
  2420.    &lt;a href=&quot;https://distantreader.org/blog/network-graphs-are-fun/betweenesses.png&quot;&gt;Nodes with non-zero betweeness values&lt;/a&gt;
  2421.  &lt;/p&gt;
  2422.  &lt;p&gt;I assert if one were to focus their attention on these remaining keywords and chapters, then one would be get a pretty good understanding of the corpus.&lt;/p&gt;
  2423.  &lt;h2&gt;Modeling multiple authors&lt;/h2&gt;
  2424.  &lt;p&gt;The Iliad and the Odyssey were written by a single person, but the process becomes much more interesting when multiple authors are combined into the same graph. Such is what I did below where I modeled the Iliad and the Odyssey (by Homer), Walden Pond (by Thoreau), and Emma (by Austen):&lt;/p&gt;
  2425.  &lt;p style=&quot;text-align: center;&quot;&gt;
  2426.    &lt;img alt=&quot;./three-authors-small.png&quot; src=&quot;https://distantreader.org/blog/network-graphs-are-fun/three-authors-small.png&quot; /&gt;
  2427.    &lt;br /&gt;
  2428.    &lt;a href=&quot;https://distantreader.org/blog/network-graphs-are-fun/three-authors.png&quot;&gt;Comparing Homer, Thoreau, and Austen&lt;/a&gt;
  2429.  &lt;/p&gt;
  2430.  &lt;p&gt;From the result I assert a number of things:&lt;/p&gt;
  2431.  &lt;ul&gt;
  2432.    &lt;li&gt;All three authors' works are distinctive.&lt;/li&gt;
  2433.    &lt;li&gt;Homer's works (still) fall into two distinct groups.&lt;/li&gt;
  2434.    &lt;li&gt;The works of Homer and of Thoreau are more similar because they share keywords such as &quot;house&quot;, &quot;man&quot;, and other things to a lesser degree.&lt;/li&gt;
  2435.    &lt;li&gt;In terms of aboutness, Austen's work is not like the works of Homer nor Thoreau.&lt;/li&gt;
  2436.  &lt;/ul&gt;
  2437.  &lt;p&gt;Granted, anybody familiar with the works of Homer, Thoreau, and/or Austen would say, &quot;So what. I already know that.&quot; And my reply would be two-fold. First, &quot;Yes, but what if you were not familiar with the works, then the illustration tell you a story and provides you with a sort of overview of the corpus.&quot; Second, &quot;Yes, I know that but now you are equipped with a larger vocabulary for use and understanding. For example, what words are obviously understood and what words are unexpected? Discuss.&quot;&lt;/p&gt;
  2438.  &lt;h2&gt;Summary&lt;/h2&gt;
  2439.  &lt;p&gt;Network graphs -- collections of nodes and edges -- are a useful thinggs for the use and understanding of narrative texts. They provide the means to measure and observe the content of a corpus in ways traditional reading does not afford.&lt;/p&gt;
  2440.  &lt;h2&gt;Epilouge&lt;/h2&gt;
  2441.  &lt;p&gt;This essay, the edges table, the two GML files, and the Gephi file used to create the visualizations ought to be availble as a zip file at:&lt;/p&gt;
  2442.  &lt;blockquote&gt;
  2443.    &lt;p&gt;
  2444.      &lt;a href=&quot;https://distantreader.org/blog/network-graphs-are-fun/network-graphs-are-fun.zip&quot;&gt;https://distantreader.org/blog/network-graphs-are-fun/network-graphs-are-fun.zip&lt;/a&gt;
  2445.    &lt;/p&gt;
  2446.  &lt;/blockquote&gt;
  2447. &lt;/div&gt;</content>
  2448.      <author>
  2449.    <name>Distant Reader Blog</name>
  2450. <uri>https://distantreader.org/blog/</uri>
  2451.      </author>
  2452.    </entry>
  2453.  
  2454.    <entry>
  2455.      <title>Web Science and Digital Libraries (WS-DL) Group at Old Dominion University: 2025-06-14: Reflecting on GAAD 2025 at ODU: A Deep‑Dive into &quot;Accessible by Design&quot;</title>
  2456.      <link rel="alternate" type="text/html" href="https://ws-dl.blogspot.com/2025/06/2025-06-14-reflecting-on-gaad2025-at.html"/>
  2457.      <id>tag:blogger.com,1999:blog-953024975153422094.post-8433127551271330657</id>
  2458.      <updated>2025-06-14T13:23:01+00:00</updated>
  2459.      <content type="html">&lt;p&gt;&lt;span style=&quot;font-family: arial;&quot;&gt; &lt;/span&gt;&lt;/p&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;span id=&quot;docs-internal-guid-45a7d120-7fff-f42f-6f9f-e456d4ed0973&quot;&gt;&lt;h1 dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 6pt; margin-top: 14pt; text-align: center;&quot;&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEgZ3jJPwcknSgzfYmm2IQZIsJoQDHth5ZQ50rRwXNDg5mwF93yXBtSDMAhXWhymunErP6yeGVTbBIMk5bUSlkDkRIImpg4AUwgVCFOGQU4l1Yqc7WwV9iQ3pcZnoe4yln4YPizH9aEtScBznvRtj8ZKXZp7b6Q1h9TA3i5qnT-yood7uBeAcbPZJxUES-0&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img alt=&quot;&quot; height=&quot;223&quot; src=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEgZ3jJPwcknSgzfYmm2IQZIsJoQDHth5ZQ50rRwXNDg5mwF93yXBtSDMAhXWhymunErP6yeGVTbBIMk5bUSlkDkRIImpg4AUwgVCFOGQU4l1Yqc7WwV9iQ3pcZnoe4yln4YPizH9aEtScBznvRtj8ZKXZp7b6Q1h9TA3i5qnT-yood7uBeAcbPZJxUES-0=w609-h223&quot; width=&quot;609&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;/h1&gt;&lt;/span&gt;&lt;span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span style=&quot;font-size: x-small; vertical-align: baseline;&quot;&gt;Figure 1: A virtual slide from Old Dominion University's GAAD 2025 event highlights &quot;Global Accessibility Awareness Day (GAAD)&quot; during a hybrid session on inclusive design.&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 9pt; vertical-align: baseline;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Celebrating &lt;/span&gt;&lt;a href=&quot;https://accessibility.day/&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Global Accessibility Awareness Day (GAAD)&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, &lt;a href=&quot;https://www.odu.edu/library&quot;&gt;Old Dominion University Libraries &lt;/a&gt;successfully hosted &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;Accessible by Design: Creating Inclusive PDFs Using Adobe Acrobat&quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; in collaboration with the recently established Employees with Disabilities Association (&lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/employees-disabilities-association&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;EWDA&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;). Led by&lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/directory/brett-cook-snell&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt; Dr. Brett Cook-Snell&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, this hybrid workshop brought together faculty, staff, and students both in person at &lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/sites/default/files/2023/documents/floormap-perrylibrary-handout.pdf&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Perry Library Room 1306&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; and online via Zoom for a thorough investigation of digital accessibility that went far beyond technical training to address fundamental questions about inclusion, ableism, and universal design. I attended the session virtually via Zoom, and in this blog post, I will share my firsthand experience and reflections. &lt;/span&gt;&lt;span style=&quot;font-size: 11pt; text-align: left; vertical-align: baseline;&quot;&gt;Since its launch in 2021 (Figure 2), when the first event was planned as a library-only endeavor, GAAD at ODU Libraries has evolved into a ritual. It has evolved over years into a campus-wide celebration with past cooperation involving the Office of Educational Accessibility and several academic departments. The event last year, for instance, concentrated on increasing web archive accessibility; as the &lt;/span&gt;&lt;a href=&quot;https://ws-dl.blogspot.com/2024/06/2024-06-04-making-web-archives-more.html&quot; style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;WS-DL blog&lt;/span&gt;&lt;/a&gt;&lt;span style=&quot;font-size: 11pt; text-align: left; vertical-align: baseline;&quot;&gt; summarizes, &quot;&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; font-style: italic; text-align: left; vertical-align: baseline;&quot;&gt;Accessibility is not a one-time fix but an ongoing process that requires continuous evaluation and improvement&lt;/span&gt;&lt;span style=&quot;font-size: 11pt; text-align: left; vertical-align: baseline;&quot;&gt;.&quot; Reiterating ODU's dedication to creating a community of education, advocacy, and connectedness among all faculty and staff, regardless of disability status, this year's collaboration with EWDA marked yet another forward step.&lt;/span&gt;&lt;/p&gt;&lt;br /&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;a href=&quot;https://x.com/ODULibraries/status/1920852928044146746&quot;&gt;&lt;/a&gt;&lt;/p&gt;&lt;blockquote class=&quot;twitter-tweet&quot;&gt;&lt;p dir=&quot;ltr&quot; lang=&quot;en&quot;&gt;&lt;a href=&quot;https://x.com/ODULibraries/status/1920852928044146746&quot;&gt;Join us in recognizing Global Accessibility Awareness Day (GAAD) on May 15 from 12-2pm for the workshop: Accessible by Design! Attend in person in Perry Library room 1306 (Lunch provided for all in-person attendees!) or online via Zoom💙♿️💻&lt;br /&gt;&lt;br /&gt;Zoom link: &lt;/a&gt;&lt;a href=&quot;https://t.co/IGu1fp3GqX&quot;&gt;https://t.co/IGu1fp3GqX&lt;/a&gt; &lt;a href=&quot;https://t.co/8nLoBfSPzI&quot;&gt;pic.twitter.com/8nLoBfSPzI&lt;/a&gt;&lt;/p&gt;— ODU Libraries (@ODULibraries) &lt;a href=&quot;https://twitter.com/ODULibraries/status/1920852928044146746?ref_src=twsrc%5Etfw&quot;&gt;May 9, 2025&lt;/a&gt;&lt;/blockquote&gt; &lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span&gt;&lt;br /&gt;&lt;/span&gt;&lt;/h3&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span&gt;Constructing Community via Cooperation and Leadership&lt;/span&gt;&lt;/h3&gt;&lt;div&gt;&lt;span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The session began with strong reflections on the universality of disability; instead of merely explaining why accessibility matters, it aimed to demonstrate how to do the work step by step, including its challenges. As speaker, &lt;a href=&quot;https://www.odu.edu/directory/tim-hackman&quot;&gt;Tim Hackman&lt;/a&gt;, Dean of University Libraries, underlined  that  “&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;We're all only temporarily able-bodied, and disability touches every family eventually, so the real question is: Will the tools we build today still welcome us tomorrow?”&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; Head of systems development at the library &lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/directory/holli-kubly&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Holly Kubli&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; gave insightful background on ODU's increasing dedication to GAAD, noting that this season marked the university's fifth year commemorating the day, from a library-only project in 2021 to a campus-wide cooperative effort. Combining advocates and practitioners from all across the university to foster &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;a community of education, advocacy, and connectedness among faculty and staff, regardless of disability status, to address shared challenges, promote inclusivity, and enhance accessibility across the university,&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&quot; the collaboration with EWDA marked a notable extension of these efforts. Identifying himself as queer, bipolar, and neurodiverse while acknowledging the land of the Powhatan Confederacy and Virginia's role in historical injustices, Dr. Cook-Snell, the recently elected president of EWDA, brought both technical expertise and lived experience to the workshop.  Emphasizing personal commitment to challenge university policies, his approach linked the work of accessibility to more general social justice initiatives.  &lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/directory/emily-harmon&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Emily Harmon&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;, vice president, who Dr. Cook-Snell credited as &quot;the brains behind this operation,&quot; and &lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/directory/kristin-osborne&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Kristen Osborne,&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; mentoring chair, helped to ensure virtual attendees stayed fully engaged throughout the session by facilitating correspondence between in-person and online participants (Figure 3).&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;span&gt;&lt;br /&gt;&lt;/span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;/p&gt;&lt;blockquote class=&quot;twitter-tweet&quot;&gt;&lt;p dir=&quot;ltr&quot; lang=&quot;en&quot;&gt;Today, &lt;a href=&quot;https://twitter.com/ODULibraries?ref_src=twsrc%5Etfw&quot;&gt;@ODULibraries&lt;/a&gt; celebrated &lt;a href=&quot;https://twitter.com/hashtag/GlobalAccessibilityAwarenessDay?src=hash&amp;amp;ref_src=twsrc%5Etfw&quot;&gt;#GlobalAccessibilityAwarenessDay&lt;/a&gt; (&lt;a href=&quot;https://twitter.com/hashtag/GAAD?src=hash&amp;amp;ref_src=twsrc%5Etfw&quot;&gt;#GAAD&lt;/a&gt;) with our &quot;Accessible by Design&quot; workshop, helping participants create more inclusive digital content! &lt;br /&gt;&lt;br /&gt;/cc &lt;a href=&quot;https://twitter.com/WebSciDL?ref_src=twsrc%5Etfw&quot;&gt;@WebSciDL&lt;/a&gt; &lt;a href=&quot;https://twitter.com/ODU?ref_src=twsrc%5Etfw&quot;&gt;@ODU&lt;/a&gt; &lt;a href=&quot;https://twitter.com/ODUSCI?ref_src=twsrc%5Etfw&quot;&gt;@ODUSCI&lt;/a&gt; &lt;a href=&quot;https://twitter.com/oducs?ref_src=twsrc%5Etfw&quot;&gt;@oducs&lt;/a&gt; &lt;a href=&quot;https://twitter.com/accessodu?ref_src=twsrc%5Etfw&quot;&gt;@accessodu&lt;/a&gt; &lt;a href=&quot;https://t.co/n32QL8MwB5&quot;&gt;pic.twitter.com/n32QL8MwB5&lt;/a&gt;&lt;/p&gt;— Md Javedul Ferdous (@jaf_ferdous) &lt;a href=&quot;https://twitter.com/jaf_ferdous/status/1923064357765742692?ref_src=twsrc%5Etfw&quot;&gt;May 15, 2025&lt;/a&gt;&lt;/blockquote&gt; &lt;/span&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span&gt;&lt;br /&gt;&lt;/span&gt;&lt;/h3&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span&gt;Recognizing Systemic Barriers and Ableism&lt;/span&gt;&lt;/h3&gt;&lt;/div&gt;&lt;/span&gt;&lt;/span&gt;&lt;div&gt;&lt;span&gt;&lt;span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;The workshop included a crucial instructional component on ableism and systemic barriers, which resonated with attendees before we began the technical training. From homes to businesses, grocery stores, and even family dynamics, the conversation looked at where ableism shows up in daily life. Participants offered careful examples covering the structural obstacles in education and the high cost of access tools to the planning stage when accessibility issues are sometimes disregarded.  One especially poignant illustration of how inclusion calls for deliberate attention even in close quarters was a participant reminding family members to say goodbye to a nonverbal nephew. The discussion touched on several forms of ableism, including the use of euphemistic language like &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;differently abled,&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&quot; microaggressions like &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;my OCD is kicking in,&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&quot; and presumptions about disabled people's capacities that result in either condescension or lowered expectations.  Participants highlighted the range of accessibility issues that exist throughout university operations by citing anything from the absence of Braille printers to rigid work rules.  This basis enabled one to link individual actions to more general concepts of equity and inclusion, so helping to explain why technical accessibility training matters beyond simple compliance.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEhJDGOKbK7VuCOrT6WWzyVT6sqVJdChKRAlsTxMwgV_V2GgbXnNHr4r9SnRIOxa52gswLt7keJjiNjWhYJ1MYUgErCMcMV5_5pRTaRyHOSrEL-k9oFfdxXqmv_AHCH-GI1kjNcU7K0sy776et6ga_CZ9ncaGpNDVSEGoxh7JRdJtN6lMUXv8Q2ceVmbTnM&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img alt=&quot;&quot; height=&quot;296&quot; src=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEhJDGOKbK7VuCOrT6WWzyVT6sqVJdChKRAlsTxMwgV_V2GgbXnNHr4r9SnRIOxa52gswLt7keJjiNjWhYJ1MYUgErCMcMV5_5pRTaRyHOSrEL-k9oFfdxXqmv_AHCH-GI1kjNcU7K0sy776et6ga_CZ9ncaGpNDVSEGoxh7JRdJtN6lMUXv8Q2ceVmbTnM=w619-h296&quot; width=&quot;619&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;div style=&quot;text-align: center;&quot;&gt;&lt;span&gt;&lt;span&gt;&lt;span style=&quot;font-family: arial; font-size: x-small;&quot;&gt;Figure 2: &lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;A look back at GAAD celebrations at ODU Libraries: From 2021 to 2024&lt;/span&gt;&lt;/div&gt;&lt;div style=&quot;text-align: center;&quot;&gt;&lt;span style=&quot;font-size: x-small;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;Technical instruction and practical learning&lt;/span&gt;&lt;/h3&gt;&lt;/div&gt;&lt;div&gt;&lt;span&gt;&lt;span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;With Dr. Cook-Snell guiding participants through key steps including auto-tagging documents, adding alternative text to images, ensuring proper reading order, and making interactive forms navigable by assistive technologies, the technical part of the workshop concentrated on practical skills using &lt;/span&gt;&lt;a href=&quot;https://www.adobe.com/acrobat/acrobat-pro.html&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Adobe Acrobat Pro&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; creating accessible PDFs.  Emphasizing Web Content Accessibility Guidelines (&lt;/span&gt;&lt;a href=&quot;https://www.w3.org/WAI/standards-guidelines/wcag/&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;WCAG&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;), the training focused on referencing ODU's official IT policy mandating compliance with Section 508 of the &lt;/span&gt;&lt;a href=&quot;https://www.section508.gov/manage/laws-and-policies/&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Rehabilitation Act&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;.  Although hand editing is usually required for best results, participants discovered that the auto-tag feature in Adobe Acrobat provides a great basis for accessibility. Important ideas, including contrast ratios, descriptive link text for screen readers, and the distinctions between decorative and functional images that call for alternative text descriptions, were discussed in the session.  Dr. Cook-Snell explained the need of appropriate document structure for screen reader navigation, showed participants how to negotiate the reading order panel, and displayed the accessibility checker feature.  Participants asked intelligent questions about managing workflow when documents need frequent revisions, juggling accessibility with security concerns for sensitive documents, and selecting suitable screen readers for testing throughout the technical instruction. Several participants said they had fresh respect for accessibility tools they had not known existed; one attendee said, &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;I didn't know this tool existed.  I am thus actually quite eager to play about with it a little more.”&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;  Another participant verified that the second time around the process became &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;slightly easier,&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&quot; so motivating ongoing skill development.  The conversation exposed the difficulty of implementing accessibility policies as well as the sincere dedication of ODU community members—even in face of technological constraints—to create inclusive digital spaces.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&lt;span style=&quot;border: none; display: inline-block; font-family: arial; height: 357px; overflow: hidden; width: 624px;&quot;&gt;&lt;img height=&quot;357&quot; src=&quot;https://lh7-rt.googleusercontent.com/docsz/AD_4nXdSL5Y-rlRzeY4dXgQsU6t0IS4byHxkXp_9tTtlTFk7BIaZgqwEMBHD5AD0H4X7KV5ngMPj4HVRAtzyCmlYHRZoT5CCWIehKls1-US5bdNmP3bB3doNpN2PuiRa6BCsHSfj8I4q?key=aPtzpROn2Fu6eW4esyucZA&quot; style=&quot;margin-left: 0px; margin-top: 0px;&quot; width=&quot;624&quot; /&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 9pt; vertical-align: baseline;&quot;&gt;Figure 3: Image shows a hybrid session of the event in progress. In the top-left panel, a live video feed captures the in-person attendees at Perry Library Room 1306. The top-right panel features &lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/directory/kristin-osborne&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 9pt; vertical-align: baseline;&quot;&gt;Kristin M. Osborne&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 9pt; vertical-align: baseline;&quot;&gt;, joining virtually from ODU Global. The bottom panel shows &lt;/span&gt;&lt;a href=&quot;https://www.odu.edu/directory/emily-harmon&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 9pt; vertical-align: baseline;&quot;&gt;Emily S. Harmon,&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 9pt; vertical-align: baseline;&quot;&gt; also participating remotely.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;Community Input and Next Projects&lt;/span&gt;&lt;/h3&gt;&lt;/div&gt;&lt;div&gt;&lt;span&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Extensive comments gathered at the workshop will guide the next EWDA projects. Participants suggested several valuable improvements, including providing step-by-step checklists for reference during hands-on activities, sharing materials through multiple channels (&lt;a href=&quot;https://workspace.google.com/products/drive/&quot;&gt;Google Drive&lt;/a&gt;, &lt;a href=&quot;https://www.zoom.com/en/products/team-chat/&quot;&gt;Zoom chat&lt;/a&gt;) for better accessibility, implementing a &quot;flipped&quot; approach where participants attempt tasks independently before group problem-solving, offering progressive learning opportunities through multi-session series, and establishing regular &quot;&lt;i&gt;office hours&lt;/i&gt;&quot; for ongoing accessibility support. One participant's recommendation for a seminar series, &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;I think this would be a great seminar and not just stopping here, but offering it in several stages...  Like today was intro, then perhaps we do another one in a month with an emphasis on a different component&quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;. The event effectively exposed many attendees to EWDA's mission and ongoing projects, including developing mentoring programs, building communication channels via newsletters and social media, and forming advocacy committees to compile community comments on accessibility improvements. Encouragement of attendees to join EWDA's expanding membership—which welcomes allies dedicated to accessibility advocacy as well as workers with disabilities— to show their dedication to inclusive participation from the beginning, the company also declared an active logo design contest, inviting community members to provide graphic representations of their mission.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;span&gt;&lt;br /&gt;&lt;/span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;/p&gt;&lt;blockquote class=&quot;twitter-tweet&quot;&gt;&lt;p dir=&quot;ltr&quot; lang=&quot;en&quot;&gt;Thanks to everyone who helped us in recognizing Global Accessibility Awareness Day (GAAD) by attending in person or online for yesterday's workshop: Accessible by Design! 🩵💻♿️&lt;a href=&quot;https://twitter.com/hashtag/GlobalAccessibilityAwarenessDay?src=hash&amp;amp;ref_src=twsrc%5Etfw&quot;&gt;#GlobalAccessibilityAwarenessDay&lt;/a&gt; &lt;a href=&quot;https://twitter.com/hashtag/GAAD?src=hash&amp;amp;ref_src=twsrc%5Etfw&quot;&gt;#GAAD&lt;/a&gt; &lt;a href=&quot;https://t.co/JNIhX4zHRD&quot;&gt;pic.twitter.com/JNIhX4zHRD&lt;/a&gt;&lt;/p&gt;— ODU Libraries (@ODULibraries) &lt;a href=&quot;https://twitter.com/ODULibraries/status/1923391188104581287?ref_src=twsrc%5Etfw&quot;&gt;May 16, 2025&lt;/a&gt;&lt;/blockquote&gt; &lt;/span&gt;&lt;/span&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/h3&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;Looking Ahead: Institutional Change and Continuous Impact&lt;/span&gt;&lt;/h3&gt;&lt;/div&gt;&lt;div&gt;&lt;span&gt;&lt;span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: justify;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;Beyond the immediate attendees, the success of the workshop attracted interest from Dr. Cook-Snell, who mentioned Provost &lt;a href=&quot;https://www.odu.edu/directory/brian-payne&quot;&gt;Payne's&lt;/a&gt; expressed desire in bringing comparable programming to the university level, and from Dean &lt;a href=&quot;https://www.odu.edu/education/about/about-the-dean&quot;&gt;Dice&lt;/a&gt; of the College of Education, who underlined the importance of &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;actionable items&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&quot; that can significantly challenge university community barriers.  This awareness reflects the larger momentum around accessibility that defined GAAD 2025 globally, as organizations all around concentrated on &quot;&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; font-style: italic; vertical-align: baseline;&quot;&gt;Committing to Change&lt;/span&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;&quot; using pragmatic, hands-on methods to turn awareness into actual accessibility improvements. The popularity of &quot;Accessible By Design&quot; shows that effective, meaningful accessibility training calls for both consciousness-raising about disability inclusion and technical skill-building about accessibility.  The workshop laid a basis for ongoing institutional change rather than one-time awareness by combining useful tools with more general conversations of ableism and institutional barriers. While the passion and careful participation shown by participants suggest continuous interest in developing accessibility competencies, Dr. Cook-Snell promised to create more thorough written guides based on participant feedback and to set regular office hours for ongoing accessibility support.  Notwithstanding technical difficulties, the hybrid format effectively modeled inclusive programming by allowing both in-person and remote participants to interact with the material, so offering a template for the next EWDA projects. Accessibility is everyone's responsibility, as GAAD 2025 reminds us globally; ODU's workshop gave community members the tools and inspiration to make accessibility a top concern in their daily work, so transcending compliance towards real inclusion.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;/span&gt;&lt;p dir=&quot;ltr&quot; style=&quot;line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;&quot;&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;-- &lt;/span&gt;&lt;a href=&quot;https://cs.odu.edu/~cs_mferd002&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;Md Javedul Ferdous&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt; (@&lt;/span&gt;&lt;a href=&quot;https://twitter.com/jaf_ferdous&quot;&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;color: #1155cc; font-size: 11pt; vertical-align: baseline;&quot;&gt;jaf_ferdous&lt;/span&gt;&lt;/a&gt;&lt;span face=&quot;&amp;quot;Quattrocento Sans&amp;quot;, sans-serif&quot; style=&quot;font-size: 11pt; vertical-align: baseline;&quot;&gt;)&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;&lt;span style=&quot;font-family: arial;&quot;&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;div style=&quot;text-align: justify;&quot;&gt;&lt;h4&gt;Acknowledgment&lt;/h4&gt;&lt;div&gt;I'd like to acknowledge &lt;a href=&quot;https://www.cs.odu.edu/~mln/&quot;&gt;Dr. Michael Nelson&lt;/a&gt; for his indispensable support in reviewing my blog. His proficiency and insights significantly enhanced its quality.&lt;/div&gt;&lt;/div&gt;&lt;div style=&quot;text-align: justify;&quot;&gt; &lt;/div&gt;&lt;/span&gt;&lt;br /&gt;&lt;p&gt;&lt;/p&gt;&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;&lt;span&gt;&lt;p&gt;&lt;/p&gt;&lt;/span&gt;</content>
  2460.      <author>
  2461.  <name>Javed (noreply@blogger.com)</name>
  2462. <uri>https://ws-dl.blogspot.com/</uri>
  2463.      </author>
  2464.    </entry>
  2465.  
  2466.    <entry>
  2467.      <title>Web Science and Digital Libraries (WS-DL) Group at Old Dominion University: 2025-05-06: Part 2 - Large Language Models (LLMs) are hallucinating in Arabic about the Quran (Open AI ChatGPT-4o)</title>
  2468.      <link rel="alternate" type="text/html" href="https://ws-dl.blogspot.com/2025/05/2025-05-06-part-2-large-language-models.html"/>
  2469.      <id>tag:blogger.com,1999:blog-953024975153422094.post-9219854499371177503</id>
  2470.      <updated>2025-06-13T17:10:10+00:00</updated>
  2471.      <content type="html">&lt;div class=&quot;separator&quot;&gt;&lt;div class=&quot;separator&quot; style=&quot;margin-left: 1em; margin-right: 1em; text-align: center;&quot;&gt;&lt;img border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTsD6LJxA3cd3qh1EtvTBQmqlUR6sl95-AXEl5a65ahT-y3KITIolzoSxZRZ3jnfucIWC8K9cDHX3E-0XBaI7AniEuYkp_gXrcPeQVdhHdnCVYRoMpDff1povCqCz3yD3VdHlFFN2zmsa38Delk-ckODsvGQ-5r0jhullv6g5CzfA4jcIcq6x1uNH47bHm/s16000/logo%202.png&quot; /&gt;&lt;/div&gt;&lt;/div&gt;&lt;h2&gt;Introduction&lt;/h2&gt;&lt;p&gt;I discussed LLMs' hallucinations and demonstrated their inability to correctly cite Quranic verses using Google Gemini as an example in &lt;a href=&quot;https://ws-dl.blogspot.com/2025/05/2025-01-25-part-1-large-language-models.html&quot; target=&quot;_blank&quot;&gt;part 1&lt;/a&gt; of this blog post. Google Gemini's answers to different prompts included incorrect Quranic citations and phrases that do not exist in the Quran attributed to specific verses in the Quran. I also explained the importance of correctly citing and presenting Quranic verses in Arabic. Here, in part 2, I present a comparison between Google Gemini and Open AI's ChatGPT. Using the same prompts, ChatGPT-4o hallucinations produced similar results to that of Google Gemini. Again, misinformation remains one of the biggest challenges that compromises LLMs' credibility quoting verses from the Quran in Arabic. Experts in Classical Arabic and those who memorize the Quran word for word can identify errors in ChatGPT's output, however, less educated people will have difficulty distinguishing between correct and incorrect answers from ChatGPT. Some Arabic speakers have examined ChatGPT's answers and found them to have &lt;a href=&quot;https://www.youtube.com/watch?v=b2fNodiU6JU&quot; target=&quot;_blank&quot;&gt;significant errors when referencing the Quran&lt;/a&gt;.&lt;/p&gt;&lt;h2&gt;Improvements in ChatGPT over the last two years&lt;/h2&gt;&lt;p&gt;The &lt;a href=&quot;https://doi.org/10.1080/14746700.2023.2255944&quot; target=&quot;_blank&quot;&gt;paper&lt;/a&gt; I reviewed in &lt;a href=&quot;https://ws-dl.blogspot.com/2025/05/2025-01-25-part-1-large-language-models.html&quot; target=&quot;_blank&quot;&gt;part 1 of this post&lt;/a&gt; tested ChatGPT-3.5 in 2023 using a sample of prompts regarding the Quran's views on misinformation. The authors found the output to have significant and consistent errors. They regenerated queries to the question “What does the Quran have to say about misinformation?”. They also sampled responses from ChatGPT with the additional prompt “please cite verses in Arabic”. The errors in the output ranged from producing correctly quoted and cited, yet irrelevant Quranic material to misordering the words of Quranic verses; the errors were not minor. They also found hallucinations in ChatGPT's answers attributing text that doesn't exist in the Quran at all to specific chapters and verses in the Quran. I repeated their experiment on &lt;a href=&quot;https://openai.com/index/gpt-4o-and-more-tools-to-chatgpt-free/&quot; target=&quot;_blank&quot;&gt;ChatGPT-4o&lt;/a&gt; in January 2025 and I was unable to reproduce their output. I did not get incorrect Quranic citations or verses that did not exist in the Quran. I did not get any altered verses from the Quran in any way. Every output was correct, accurate, and authentic. &lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPxvzbV1VLQHEJvp0KRsESVZrDzMdWmFvoA0-K4scOMqbvfZrNDTT2N4WHEZ3ZoDXyvrJt2Gh4TcjbpQM8nXuPWehyh4mR9JH6wAvTR3WcQJ04L9j9uxZapU9JtI5JLSCx0M3qnEfz8vdQxixV-RTijuUjcD3QEn5HKA0mpxwksScgJFx9SgQDhpTcmldI/s972/gpt1.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Correct answers from ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPxvzbV1VLQHEJvp0KRsESVZrDzMdWmFvoA0-K4scOMqbvfZrNDTT2N4WHEZ3ZoDXyvrJt2Gh4TcjbpQM8nXuPWehyh4mR9JH6wAvTR3WcQJ04L9j9uxZapU9JtI5JLSCx0M3qnEfz8vdQxixV-RTijuUjcD3QEn5HKA0mpxwksScgJFx9SgQDhpTcmldI/s16000/gpt1.PNG&quot; title=&quot;Correct answers from ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Correct answers from ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;However, when I repeated the experiment I did with Google Gemini in&lt;a href=&quot;https://ws-dl.blogspot.com/2025/05/2025-01-25-part-1-large-language-models.html&quot; target=&quot;_blank&quot;&gt; part 1 of this post&lt;/a&gt; in January of 2025 and asked ChatGPT for additional verses in Arabic, the results were not much better than the ones I got from Google Gemini. ChatGPT provided the wrong reference one time when I asked for additional verses on misinformation. In addition, ChatGPT provided phrases that do not exist in the Quran and attributed them to specific verses in the Quran when I tried different prompts that are not in the paper, but they have the same meaning. Although the errors in ChatGPT's output are not as bad as the ones I found in Google Gemini's, I think the impact is worse because the wrong verses are closer to correct verses in the Quran, and they can definitely be taken as correct verses to the non-expert. The difference was in one letter in one case and one word in another case. The output's syntax was also wrong in the latter.&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9pZsPw0ullIKgbUUwPEh4vfdh4PWmLo0Yx6n0UmeCcrsp5qTSpDvFnXapZwa1E61vEdlbxE0ko8LWzXCTlcL5QfFJ1rnJnV0Sj09TFoTog766HWbnl_CyhyphenhyphenN8_yQRKTiWNTIqPNTg1hM8-6X0oRJ8wjKwVVBiKf_xizGULHMm1RCVsTcieeXKS3V-2DYy/s928/gpt2.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Wrong chapter and verses' numbers from ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9pZsPw0ullIKgbUUwPEh4vfdh4PWmLo0Yx6n0UmeCcrsp5qTSpDvFnXapZwa1E61vEdlbxE0ko8LWzXCTlcL5QfFJ1rnJnV0Sj09TFoTog766HWbnl_CyhyphenhyphenN8_yQRKTiWNTIqPNTg1hM8-6X0oRJ8wjKwVVBiKf_xizGULHMm1RCVsTcieeXKS3V-2DYy/s16000/gpt2.PNG&quot; title=&quot;Wrong chapter and verses' numbers from ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Wrong chapter and verses' numbers from ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;span style=&quot;color: #b45f06;&quot;&gt;Me: Give me additional verses&lt;/span&gt;&lt;/p&gt;&lt;p&gt;ChatGPT:&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Certainly! The Quran repeatedly emphasizes the dangers of misinformation, falsehoods, and irresponsible speech. Here are additional verses in Arabic that address this issue:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;6. Do Not Spread Falsehood Knowingly&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Aal-e-Imran (3:78)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَإِنَّ مِنْهُمْ لَفَرِيقًا يَلْوُونَ أَلْسِنَتَهُمْ بِالْكِتَابِ لِتَحْسَبُوهُ مِنَ الْكِتَابِ وَمَا هُوَ مِنَ الْكِتَابِ وَيَقُولُونَ هُوَ مِنْ عِنْدِ اللَّهِ وَمَا هُوَ مِنْ عِنْدِ اللَّهِ وَيَقُولُونَ عَلَى اللَّهِ الْكَذِبَ وَهُمْ يَعْلَمُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And indeed, there is among them a party who alter the Scripture with their tongues so that you may think it is from the Scripture, but it is not from the Scripture. And they say, 'This is from Allah,' but it is not from Allah. And they speak a lie against Allah while they know [it].&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;7. Lying Brings a Curse from Allah&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Aal-e-Imran (3:61)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;فَمَنْ حَاجَّكَ فِيهِ مِنْ بَعْدِ مَا جَاءَكَ مِنَ الْعِلْمِ فَقُلْ تَعَالَوْا نَدْعُ أَبْنَاءَنَا وَأَبْنَاءَكُمْ وَنِسَاءَنَا وَنِسَاءَكُمْ وَأَنْفُسَنَا وَأَنْفُسَكُمْ ثُمَّ نَبْتَهِلْ فَنَجْعَلْ لَعْنَتَ اللَّهِ عَلَى الْكَاذِبِينَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Then whoever argues with you about it after knowledge has come to you, say, 'Come, let us call our sons and your sons, our women and your women, ourselves and yourselves, then supplicate earnestly and invoke the curse of Allah upon the liars.'&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;8. Do Not Falsify Facts or Distort the Truth&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Ma’idah (5:41)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يُحَرِّفُونَ الْكَلِمَ مِنْ بَعْدِ مَوَاضِعِهِ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;They distort words from their [proper] places...&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #f4cccc; color: #2b00fe;&quot;&gt;9. Lying is from Shaytan (Satan)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #f4cccc; color: #2b00fe;&quot;&gt;📖 Surah Al-Mujadilah (58:14-15)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #f4cccc; color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #f4cccc; color: #2b00fe;&quot;&gt;وَإِذَا رَأَيْتَهُمْ تُعْجِبُكَ أَجْسَامُهُمْ وَإِنْ يَقُولُوا تَسْمَعْ لِقَوْلِهِمْ كَأَنَّهُمْ خُشُبٌ مُسَنَّدَةٌ ۖ يَحْسَبُونَ كُلَّ صَيْحَةٍ عَلَيْهِمْ ۚ هُمُ الْعَدُوُّ فَاحْذَرْهُمْ ۚ قَاتَلَهُمُ اللَّهُ أَنَّىٰ يُؤْفَكُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And when you see them, their forms please you, and if they speak, you listen to their speech. [They are] like pieces of wood propped up – they think that every shout is against them. They are the enemy, so beware of them. May Allah destroy them; how are they deluded?&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;10. Lying Leads to the Fire&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Az-Zumar (39:60)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيَوْمَ الْقِيَامَةِ تَرَى الَّذِينَ كَذَبُوا عَلَى اللَّهِ وُجُوهُهُمْ مُسْوَدَّةٌ ۚ أَلَيْسَ فِي جَهَنَّمَ مَثْوًى لِلْمُتَكَبِّرِينَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And on the Day of Resurrection, you will see those who lied against Allah – their faces will be blackened. Is there not in Hell a residence for the arrogant?&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;11. Truthfulness Leads to Salvation&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Ahzab (33:70-71)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يَا أَيُّهَا الَّذِينَ آمَنُوا اتَّقُوا اللَّهَ وَقُولُوا قَوْلًا سَدِيدًا ۝ يُصْلِحْ لَكُمْ أَعْمَالَكُمْ وَيَغْفِرْ لَكُمْ ذُنُوبَكُمْ ۗ وَمَنْ يُطِعِ اللَّهَ وَرَسُولَهُ فَقَدْ فَازَ فَوْزًا عَظِيمًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;O you who have believed, fear Allah and speak words of appropriate justice. He will [then] amend for you your deeds and forgive you your sins. And whoever obeys Allah and His Messenger has certainly attained a great attainment.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Conclusion&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;The Quran strongly condemns misinformation, distortion, and falsehood. It teaches that:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;✅ Truthfulness leads to success&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;❌ Spreading falsehoods leads to regret and punishment&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;⚠️ Lies cause harm, injustice, and corruption in society&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Would you like a deeper explanation of any verse? 😊&lt;/span&gt;&lt;/p&gt;&lt;p&gt;The 9th quoted verse (highlighted in red) is a verse from the Quran, but it is not in Quran 58:14-15. It is in Quran 63:4. The two verses in Quran 58:14-15 are:&lt;/p&gt;&lt;p&gt; أَلَمْ تَرَ إِلَى الَّذِينَ تَوَلَّوْا قَوْمًا غَضِبَ اللَّهُ عَلَيْهِم مَّا هُم مِّنكُمْ وَلَا مِنْهُمْ وَيَحْلِفُونَ عَلَى الْكَذِبِ وَهُمْ يَعْلَمُونَ (14) أَعَدَّ اللَّهُ لَهُمْ عَذَابًا شَدِيدًا ۖ إِنَّهُمْ سَاءَ مَا كَانُوا يَعْمَلُونَ (15)&lt;/p&gt;&lt;p&gt;The translation is:&lt;/p&gt;&lt;p&gt;&lt;span&gt;Have you noted those who befriended people with whom GOD is angry? They neither belong with you, nor with them. They deliberately swear lies! (14) &lt;/span&gt;GOD has prepared for them a severe retribution. Miserable indeed is what they used to do. (15)&lt;/p&gt;&lt;p&gt;ChatGPT correctly cited the Quran 63:4 (highlighted green) in the following answer when I repeated the prompt.&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #b45f06;&quot;&gt;Me: Give me additional verses&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;ChatGPT:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Certainly! The Quran contains many verses warning against misinformation, lying, and spreading falsehoods. Here are additional verses in Arabic, along with their meanings:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;12. Lying is a Major Sin&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Az-Zumar (39:32)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;فَمَنْ أَظْلَمُ مِمَّنْ كَذَبَ عَلَى اللَّهِ وَكَذَّبَ بِالصِّدْقِ إِذْ جَاءَهُ ۚ أَلَيْسَ فِي جَهَنَّمَ مَثْوًى لِلْكَافِرِينَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;So who is more unjust than one who lies about Allah and denies the truth when it has come to him? Is there not in Hell a residence for the disbelievers?&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;13. Speaking Falsehood is a Sign of Hypocrisy&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah At-Tawbah (9:77)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;فَأَعْقَبَهُمْ نِفَاقًا فِي قُلُوبِهِمْ إِلَىٰ يَوْمِ يَلْقَوْنَهُ بِمَا أَخْلَفُوا اللَّهَ مَا وَعَدُوهُ وَبِمَا كَانُوا يَكْذِبُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;So He penalized them with hypocrisy in their hearts until the day they will meet Him – because they failed Allah in what they promised Him and because they used to lie.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;14. Allah Hates Falsehood&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah An-Nahl (16:105)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;إِنَّمَا يَفْتَرِي الْكَذِبَ الَّذِينَ لَا يُؤْمِنُونَ بِآيَاتِ اللَّهِ وَأُو۟لَٰئِكَ هُمُ ٱلْكَٰذِبُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Only those who do not believe in the verses of Allah fabricate falsehood, and it is they who are the liars.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;15. Spreading Falsehood Corrupts Society&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Baqarah (2:191)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَالْفِتْنَةُ أَشَدُّ مِنَ الْقَتْلِ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And fitnah (spreading corruption and falsehood) is worse than killing.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;16. Lying Leads to Destruction&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Ankabut (29:3)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَقَدْ فَتَنَّا الَّذِينَ مِنْ قَبْلِهِمْ ۖ فَلَيَعْلَمَنَّ اللَّهُ الَّذِينَ صَدَقُوا وَلَيَعْلَمَنَّ الْكَاذِبِينَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And We have certainly tested those before them, and Allah will surely make evident those who are truthful, and He will surely make evident the liars.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;17. Spreading False News Causes Harm&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Ahzab (33:58)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَالَّذِينَ يُؤْذُونَ الْمُؤْمِنِينَ وَالْمُؤْمِنَاتِ بِغَيْرِ مَا اكْتَسَبُوا فَقَدِ احْتَمَلُوا بُهْتَانًا وَإِثْمًا مُبِينًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And those who harm believing men and believing women for [something] other than what they have earned have certainly borne upon themselves a slander and manifest sin.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;18. False Speech Leads to Hellfire&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Mutaffifin (83:10-12)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيْلٌ يَوْمَئِذٍ لِلْمُكَذِّبِينَ ۝ الَّذِينَ يُكَذِّبُونَ بِيَوْمِ الدِّينِ ۝ وَمَا يُكَذِّبُ بِهِ إِلَّا كُلُّ مُعْتَدٍ أَثِيمٍ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Woe that Day to the deniers, those who deny the Day of Recompense. And none denies it except every sinful transgressor.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #b6d7a8; color: #2b00fe;&quot;&gt;19. Hypocrites Spread Falsehood&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #b6d7a8; color: #2b00fe;&quot;&gt;📖 Surah Al-Munafiqun (63:4)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #b6d7a8; color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #b6d7a8; color: #2b00fe;&quot;&gt;وَإِذَا رَأَيْتَهُمْ تُعْجِبُكَ أَجْسَامُهُمْ ۖ وَإِنْ يَقُولُوا تَسْمَعْ لِقَوْلِهِمْ ۖ كَأَنَّهُمْ خُشُبٌ مُسَنَّدَةٌ ۖ يَحْسَبُونَ كُلَّ صَيْحَةٍ عَلَيْهِمْ ۚ هُمُ الْعَدُوُّ فَاحْذَرْهُمْ ۚ قَاتَلَهُمُ اللَّهُ ۖ أَنَّىٰ يُؤْفَكُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And when you see them, their forms please you, and if they speak, you listen to their speech. [They are] like pieces of wood propped up – they think that every shout is against them. They are the enemy, so beware of them. May Allah destroy them; how are they deluded?&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;20. Allah Commands Speaking the Truth&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-An’am (6:152)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَإِذَا قُلْتُمْ فَاعْدِلُوا وَلَوْ كَانَ ذَا قُرْبَىٰ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And when you speak, be just, even if it concerns a near relative.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Conclusion&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;The Quran provides a strong warning against spreading misinformation, lying, and distorting the truth. It encourages:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;✅ Speaking truthfully and justly&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;✅ Verifying information before spreading it&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;✅ Avoiding false accusations and rumors&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Would you like more explanations or interpretations of any of these verses? 😊&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;I tried the same prompts that made Google Gemini hallucinate; the output to the first prompt was correct.&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCtnkJsIhLXozZ5NOAS_MmvBt7w-lklc7PeNXe-6hTaZbAoaoIThECiWLQQgx9DdKqt1IC6D3HWlJE0bt3ZxoE4PcwfBbuUnxrfeDkdbWDboXcg4BqzeETQVMEgbAOQQFth8HFQTQnO18yeksE4hYm0qdzLRKg5T6J35R3j158xNiSRRwpnHyYYEH7XaUY/s919/gpt3.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Correct answers from ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCtnkJsIhLXozZ5NOAS_MmvBt7w-lklc7PeNXe-6hTaZbAoaoIThECiWLQQgx9DdKqt1IC6D3HWlJE0bt3ZxoE4PcwfBbuUnxrfeDkdbWDboXcg4BqzeETQVMEgbAOQQFth8HFQTQnO18yeksE4hYm0qdzLRKg5T6J35R3j158xNiSRRwpnHyYYEH7XaUY/s16000/gpt3.PNG&quot; title=&quot;Correct answers from ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;direction: rtl; text-align: center;&quot;&gt;Correct answers from ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #b45f06;&quot;&gt;Me: Give me Quran verses in Arabic about falsehood&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;ChatGPT:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Certainly! Here are several Quranic verses in Arabic about falsehood (الباطل والكذب):&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;1. Falsehood Will Perish&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Isra (17:81)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَقُلْ جَاءَ الْحَقُّ وَزَهَقَ الْبَاطِلُ ۚ إِنَّ الْبَاطِلَ كَانَ زَهُوقًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And say: Truth has come, and falsehood has vanished. Indeed, falsehood is bound to perish.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;2. Lying is a Great Sin&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Az-Zumar (39:3)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;أَلَا لِلَّهِ الدِّينُ الْخَالِصُ ۚ وَالَّذِينَ اتَّخَذُوا مِن دُونِهِ أَوْلِيَاءَ مَا نَعْبُدُهُمْ إِلَّا لِيُقَرِّبُونَا إِلَى اللَّهِ زُلْفَىٰ ۚ إِنَّ اللَّهَ يَحْكُمُ بَيْنَهُمْ فِي مَا هُمْ فِيهِ يَخْتَلِفُونَ ۗ إِنَّ اللَّهَ لَا يَهْدِي مَنْ هُوَ كَاذِبٌ كَفَّارٌ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Unquestionably, for Allah is the pure religion. And those who take protectors besides Him [say], 'We only worship them that they may bring us nearer to Allah in position.' Indeed, Allah will judge between them concerning that over which they differ. Indeed, Allah does not guide one who is a liar and disbeliever.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;3. Falsehood Leads to Hellfire&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Mutaffifin (83:10-12)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيْلٌ يَوْمَئِذٍ لِلْمُكَذِّبِينَ ۝ الَّذِينَ يُكَذِّبُونَ بِيَوْمِ الدِّينِ ۝ وَمَا يُكَذِّبُ بِهِ إِلَّا كُلُّ مُعْتَدٍ أَثِيمٍ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Woe that Day to the deniers, those who deny the Day of Recompense. And none denies it except every sinful transgressor.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;4. Falsehood Brings Allah’s Curse&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah An-Nahl (16:105)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;إِنَّمَا يَفْتَرِي الْكَذِبَ الَّذِينَ لَا يُؤْمِنُونَ بِآيَاتِ اللَّهِ وَأُو۟لَٰئِكَ هُمُ ٱلْكَٰذِبُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Only those who do not believe in the verses of Allah fabricate falsehood, and it is they who are the liars.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;5. Do Not Conceal the Truth&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Baqarah (2:42)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَا تَلْبِسُوا الْحَقَّ بِالْبَاطِلِ وَتَكْتُمُوا الْحَقَّ وَأَنْتُمْ تَعْلَمُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And do not mix the truth with falsehood or conceal the truth while you know [it].&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;6. False Speech is Forbidden&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Hajj (22:30)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;فَاجْتَنِبُوا الرِّجْسَ مِنَ الْأَوْثَانِ وَاجْتَنِبُوا قَوْلَ الزُّورِ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;So avoid the impurity of idols and avoid false speech.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;7. The Wicked Spread Falsehood&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Jathiyah (45:7)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيْلٌ لِكُلِّ أَفَّاكٍ أَثِيمٍ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Woe to every sinful liar.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;8. Liars Will Be Held Accountable&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Qaf (50:18)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;مَّا يَلْفِظُ مِن قَوْلٍ إِلَّا لَدَيْهِ رَقِيبٌ عَتِيدٌ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Man does not utter any word except that with him is an observer prepared [to record].&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;9. Do Not Follow Falsehood&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Furqan (25:72)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَالَّذِينَ لَا يَشْهَدُونَ الزُّورَ وَإِذَا مَرُّوا بِاللَّغْوِ مَرُّوا كِرَامًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And [the servants of the Most Merciful are] those who do not testify to falsehood, and when they pass near ill speech, they pass by with dignity.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;10. Liars Will Be Disgraced&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Kahf (18:5)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;مَا لَهُم بِهِۦ مِنْ عِلْمٍ وَلَا لِأَبَآئِهِمْ ۚ كَبُرَتْ كَلِمَةً تَخْرُجُ مِنْ أَفْوَٰهِهِمْ ۚ إِن يَقُولُونَ إِلَّا كَذِبًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;They have no knowledge of it, nor do their forefathers. Grave is the word that comes out of their mouths; they speak not except a lie.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Conclusion&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;The Quran strongly condemns falsehood, lies, and deception. It teaches that:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;✅ Truthfulness is a virtue&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;❌ Falsehood leads to loss and punishment&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;⚠️ Spreading lies and misinformation is a major sin&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Would you like more verses or interpretations? 😊&lt;/span&gt;&lt;/p&gt;&lt;p&gt;I asked for additional verses and that's when worse errors began to take place.&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEitQvkuEz3gv7N6UODYguT_KPmnuCkaG51l4k7n2UvtQ4bUAWP7DkxHP8T5MM7LV1SWarEfDiWKzLl2EUWALPlQ9dk2xGy12SWMINzx1u4ivEeoGaPVV1YMb2ZIQQnzk3kkcdsERsswiBpCfSLRK_1tVZKy_8PO6zf-DK5KswOIZ5ZJdJCNBj9_OD0963Hf/s649/gpt4.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Wrong answers from ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEitQvkuEz3gv7N6UODYguT_KPmnuCkaG51l4k7n2UvtQ4bUAWP7DkxHP8T5MM7LV1SWarEfDiWKzLl2EUWALPlQ9dk2xGy12SWMINzx1u4ivEeoGaPVV1YMb2ZIQQnzk3kkcdsERsswiBpCfSLRK_1tVZKy_8PO6zf-DK5KswOIZ5ZJdJCNBj9_OD0963Hf/s16000/gpt4.PNG&quot; title=&quot;Wrong answers from ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Wrong answers from ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOJcQ_sgstszSKxl5F3yZesJ-I0RxqyFZEYNDnarl-qCPeBfVRoOz1lGpBlaa4kIFUUs7PIVHpPOLAAttEGpoH-_esZf6ielu3Mip-qLElQYh_P8iZhJGPp0ZzThfu9SIvA707ZJhEhvWi_yrm_QphCRo-eShkcXD4_VXjghuq4K_xvFT3MQYROjQH4Dek/s791/gpt5.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Wrong answers from ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOJcQ_sgstszSKxl5F3yZesJ-I0RxqyFZEYNDnarl-qCPeBfVRoOz1lGpBlaa4kIFUUs7PIVHpPOLAAttEGpoH-_esZf6ielu3Mip-qLElQYh_P8iZhJGPp0ZzThfu9SIvA707ZJhEhvWi_yrm_QphCRo-eShkcXD4_VXjghuq4K_xvFT3MQYROjQH4Dek/s16000/gpt5.PNG&quot; title=&quot;Wrong answers from ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Wrong answers from ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #b45f06;&quot;&gt;Me: Give me more verses&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;ChatGPT:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Certainly! Here are more Quranic verses in Arabic that address falsehood (الباطل) and lying (الكذب):&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;11. The Falsehood of Idol Worship&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Ankabut (29:17)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;إِنَّمَا تَعْبُدُونَ مِن دُونِ اللَّهِ أَوْثَٰنًا وَتَخْلُقُونَ إِفْكًا ۚ إِنَّ الَّذِينَ تَعْبُدُونَ مِن دُونِ اللَّهِ لَا يَمْلِكُونَ لَكُمْ رِزْقًا فَٱبْتَغُوا۟ عِندَ ٱللَّهِ ٱلرِّزْقَ وَٱعْبُدُوهُ وَٱشْكُرُوا۟ لَهُۥٓ ۖ إِلَيْهِ تُرْجَعُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;You only worship, besides Allah, idols, and you invent falsehood. Indeed, those you worship besides Allah do not possess [for] you [the ability to provide] sustenance. So seek sustenance from Allah and worship Him and be grateful to Him. To Him you will be returned.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;12. Liars Will Be Exposed on the Day of Judgment&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Az-Zumar (39:60)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيَوْمَ ٱلْقِيَٰمَةِ تَرَى ٱلَّذِينَ كَذَبُوا۟ عَلَى ٱللَّهِ وُجُوهُهُم مُّسْوَدَّةٌ ۚ أَلَيْسَ فِى جَهَنَّمَ مَثْوًىۭ لِّلْمُتَكَبِّرِينَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And on the Day of Resurrection, you will see those who lied about Allah with their faces blackened. Is there not in Hell a residence for the arrogant?&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;13. Falsehood is a Tool of the Corrupt&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Baqarah (2:78-79)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَمِنْهُمْ أُمِّيُّونَ لَا يَعْلَمُونَ الْكِتَابَ إِلَّا أَمَانِيَّ وَإِنْ هُمْ إِلَّا يَظُنُّونَ ۝ فَوَيْلٌ لِلَّذِينَ يَكْتُبُونَ الْكِتَابَ بِأَيْدِيهِمْ ثُمَّ يَقُولُونَ هَٰذَا مِنْ عِندِ اللَّهِ لِيَشْتَرُوا بِهِ ثَمَنًا قَلِيلًا ۖ فَوَيْلٌ لَهُمْ مِمَّا كَتَبَتْ أَيْدِيهِمْ وَوَيْلٌ لَهُمْ مِمَّا يَكْسِبُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And among them are unlettered ones who do not know the Scripture except [indulgement in] wishful thinking, but they are only assuming. So woe to those who write the 'scripture' with their own hands, then say, 'This is from Allah,' in order to exchange it for a small price. Woe to them for what their hands have written, and woe to them for what they earn.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;14. Falsehood Leads to Corruption&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Ma’idah (5:41)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #ea9999; color: #2b00fe;&quot;&gt;يَسْمَعُونَ لِلْكَذِبِ أَكَّالُونَ لِلسُّحْتِ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;They listen to falsehood and consume forbidden gain...&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;15. False Oaths Do Not Protect Liars&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah An-Nahl (16:94)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَا تَتَّخِذُوا أَيْمَانَكُمْ دَخَلًا بَيْنَكُمْ فَتَزِلَّ قَدَمٌ بَعْدَ ثُبُوتِهَا وَتَذُوقُوا السُّوءَ بِمَا صَدَدتُّمْ عَن سَبِيلِ اللَّهِ وَلَكُمْ عَذَابٌ عَظِيمٌ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And do not take your oaths as a means of deception among yourselves, lest a foot slip after it was firmly planted, and you would taste evil for what you diverted from the way of Allah, and you would have a great punishment.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;16. Satan Uses Falsehood to Mislead&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-A’raf (7:27)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يَٰبَنِىٓ ءَادَمَ لَا يَفْتِنَنَّكُمُ ٱلشَّيْطَٰنُ كَمَآ أَخْرَجَ أَبَوَيْكُم مِّنَ ٱلْجَنَّةِ يَنزِعُ عَنْهُمَا لِبَاسَهُمَا لِيُرِيَهُمَا سَوْءَٰتِهِمَا ۚ إِنَّهُۥ يَرَىٰكُمْ هُوَ وَقَبِيلُهُۥ مِنْ حَيْثُ لَا تَرَوْنَهُمْ ۗ إِنَّا جَعَلْنَا ٱلشَّيَٰطِينَ أَوْلِيَٓاءَ لِلَّذِينَ لَا يُؤْمِنُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;O children of Adam, let not Satan tempt you as he removed your parents from Paradise, stripping them of their clothing to show them their private parts. Indeed, he sees you, he and his tribe, from where you do not see them. Indeed, We have made the devils allies to those who do not believe.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;17. Falsehood Brings Eternal Regret&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Kahf (18:103-104)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;قُلْ هَلْ نُنَبِّئُكُم بِالْأَخْسَرِينَ أَعْمَالًا ۝ ٱلَّذِينَ ضَلَّ سَعْيُهُمْ فِى ٱلْحَيَوٰةِ ٱلدُّنْيَا وَهُمْ يَحْسَبُونَ أَنَّهُمْ يُحْسِنُونَ صُنْعًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Say, 'Shall We inform you of the greatest losers in respect to [their] deeds? Those whose effort is lost in worldly life, while they think that they are doing well in work.'&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;18. False Promises Are a Tool of Hypocrites&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Munafiqun (63:1)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;إِذَا جَاءَكَ ٱلْمُنَٰفِقُونَ قَالُوا۟ نَشْهَدُ إِنَّكَ لَرَسُولُ ٱللَّهِ ۗ وَٱللَّهُ يَعْلَمُ إِنَّكَ لَرَسُولُهُۥ وَٱللَّهُ يَشْهَدُ إِنَّ ٱلْمُنَٰفِقِينَ لَكَٰذِبُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;When the hypocrites come to you, [O Muhammad], they say, 'We testify that you are the Messenger of Allah.' And Allah knows that you are His Messenger, and Allah testifies that the hypocrites are liars.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;19. Those Who Lie About Allah Will Face Severe Punishment&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Al-Kahf (18:15)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;background-color: #ea9999; color: #2b00fe;&quot;&gt;مَنۢ أَظْلَمُ مِمَّنِ ٱفْتَرَىٰ عَلَى ٱللَّهِ كَذِبًا&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;Who is more unjust than one who invents a lie against Allah?&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;20. Justice and Truth Will Prevail&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;📖 Surah Yunus (10:82)&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيُحِقُّ اللَّهُ الْحَقَّ بِكَلِمَاتِهِ وَلَوْ كَرِهَ الْمُجْرِمُونَ&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;🔹 &quot;And Allah establishes the truth by His words, even if the criminals dislike it.&quot;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Conclusion&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;The Quran strongly warns against falsehood, deception, and lying. It teaches that:&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;✅ Truth will always triumph&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;❌ Falsehood leads to disgrace and punishment&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;⚠️ Lying is a characteristic of the corrupt and hypocrites&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Would you like more explanations or interpretations? 😊&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;ChatGPT attributed the phrase يَسْمَعُونَ لِلْكَذِبِ أَكَّالُونَ لِلسُّحْتِ to Quran 5:41 in item number 14 in the answer. While this is an incorrect citation and the syntax of the phrase is also incorrect, Quran 5:42 has a very similar phrase سَمَّاعُونَ لِلْكَذِبِ أَكَّالُونَ لِلسُّحْتِ&lt;/p&gt;&lt;p&gt;The phrase in ChatGPT's output translates to: They are listening to lies, devourers of what's forbidden (bribery).&lt;/p&gt;&lt;p&gt;The phrase from Quran 5:42 translates to: Listeners to lies, devourers of what's forbidden (bribery).&lt;/p&gt;&lt;p&gt;I was able to catch this error quickly because I have read the entire Quran multiple times and have seen and heard the correct verse so many times. I absolutely remember it. For someone who haven't seen it or unaware of the wrong syntax, they will not be able to identify this error.&lt;/p&gt;&lt;p&gt;The second error in ChatGPT's output is even worse because the difference is only in one letter. I was only able to identify this error because I checked each and every verse in the output of this entire experiment with an authentic copy of the Quran (word for word). ChatGPT attributed the phrase مَنۢ أَظْلَمُ مِمَّنِ ٱفْتَرَىٰ عَلَى ٱللَّهِ كَذِبًا to Quran 18:15 in item number 19 in the answer. While this is an incorrect citation, the syntax of the phrase is not necessarily incorrect. The only difference between this phrase and a phrase in Quran 18:15 is a deletion of one letter (ف) from the first word in the phrase. The deletion does not change the meaning or make it wrong syntactically since the output is lacking the context (the previous phrase in the same verse). To someone who doesn't memorize this verse word for word, this error cannot be identified.&lt;/p&gt;&lt;p&gt;Quran 18:15 has the phrase فَمَنْ أَظْلَمُ مِمَّنِ افْتَرَىٰ عَلَى اللَّهِ كَذِبًا&lt;/p&gt;&lt;p&gt;The translation of both phrases (the correct one and the wrong one) is:&lt;/p&gt;&lt;p&gt;Who is more evil than the one who fabricates lies and attributes them to GOD?&lt;/p&gt;&lt;p&gt;In January 2025, I was unable to reproduce the errors that Ali-Reza Bhojani and Marcus Schwarting found in &lt;a href=&quot;https://doi.org/10.1080/14746700.2023.2255944&quot; target=&quot;_blank&quot;&gt;2023&lt;/a&gt; in ChatGPT's output. The output of ChatGPT was consistently correct to the prompts mentioned in the paper. I tried to archive the ChatGPT's answers to the prompts I issued. Although SPN said that a snapshot was captured, when trying to retrieve the archived copy, the page said that the Wayback Machine has not archived that &lt;a href=&quot;https://web.archive.org/web/20250201023434/https://chatgpt.com/c/679d6bfa-3a88-8006-aa1f-df16cb29d6e1&quot; target=&quot;_blank&quot;&gt;URL&lt;/a&gt;.&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSAbfRF0jOiqeFywZUbtKx071Mu18zyJY4v8zvd2JvATOaWVcEMZATAjsSiUCUF2HR81y90Lzq0KR-mUwh-SpvfuNmU9zX3XsC3y_SPJXg07OJchJakO5t01F9wx9AJrWgHmaHHRU3U4F3E6ec375nk0iCFjRQoVbr0LcU4v-E-1CCixDSs0qIxMDZEgZx/s1345/wm.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;The Internet Archive was unable to archive chat with ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSAbfRF0jOiqeFywZUbtKx071Mu18zyJY4v8zvd2JvATOaWVcEMZATAjsSiUCUF2HR81y90Lzq0KR-mUwh-SpvfuNmU9zX3XsC3y_SPJXg07OJchJakO5t01F9wx9AJrWgHmaHHRU3U4F3E6ec375nk0iCFjRQoVbr0LcU4v-E-1CCixDSs0qIxMDZEgZx/s16000/wm.PNG&quot; title=&quot;The Internet Archive was unable to archive chat with ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;The Internet Archive was unable to archive chat with ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;It is possible to archive the entire chat(s) with ChatGPT if the user created a public link (shareable) for the chat and submitted it to archive.is&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEghj-pdtlEi3XfvcDqnuOYqGej92SiCxAaIpj1LgcCDfpJcKzvKlnP017gHc7ypTX5qoxRQuVJJb5w4Yv4dk7vnUg0FoIfx2IoBEtZw3l8nWsJ5GBA9109v1slCzcCVoBB3LQcfaFWdrOuf_wBhH4WmagYUjHUOcve5r08bdm8pwM3YUWDFSbxtZ3AIIsT5/s522/share1.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Creating a public link to chat with ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEghj-pdtlEi3XfvcDqnuOYqGej92SiCxAaIpj1LgcCDfpJcKzvKlnP017gHc7ypTX5qoxRQuVJJb5w4Yv4dk7vnUg0FoIfx2IoBEtZw3l8nWsJ5GBA9109v1slCzcCVoBB3LQcfaFWdrOuf_wBhH4WmagYUjHUOcve5r08bdm8pwM3YUWDFSbxtZ3AIIsT5/s16000/share1.PNG&quot; title=&quot;Creating a public link to chat with ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Creating a public link to chat with ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt; &lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRnyu_Jfusa2qDMDN1kgnEBdzigcV1DRV3KF8IOAvCOuIklvcHbOwa6oShdOdxd5_BV9GVn90OqvII8mQvczhKsqOXLaQDQUDBbwpDVwmFZrWwhu8bRrBc2YZSvPbf0q1QAF2fk0Cm9UpZmihJFXPb3OFY4CQTgq0nIRpiG5sQEUGlNyUCHiq8d71v93OV/s513/share2.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;The created/updated link from ChatGPT to be shared and archived&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRnyu_Jfusa2qDMDN1kgnEBdzigcV1DRV3KF8IOAvCOuIklvcHbOwa6oShdOdxd5_BV9GVn90OqvII8mQvczhKsqOXLaQDQUDBbwpDVwmFZrWwhu8bRrBc2YZSvPbf0q1QAF2fk0Cm9UpZmihJFXPb3OFY4CQTgq0nIRpiG5sQEUGlNyUCHiq8d71v93OV/s16000/share2.PNG&quot; title=&quot;The created/updated link from ChatGPT to be shared and archived&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;The created/updated link from ChatGPT to be shared and archived&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;I archived &lt;a href=&quot;https://archive.ph/LqHYP&quot; target=&quot;_blank&quot;&gt;my chat with ChatGPT on archive.is&lt;/a&gt;.&lt;/p&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-Kx_6gHLiixTNAnoQRgQoPxHREzm8TEdA8y6pdAymbeQVKuOXTSVuNGm-GgeUp-_GwkiAebcZ6z3ctezyfJ7xWlwZWpQVY1qH4VjN7wyfqOW7dKz52bwuMglRpoOiDeNXhCRyUQL3ed43GDnwpRDLlUWD_YNmUfwORJJlytHX-InpaNfEE01VNVP6BCKP/s937/is1.PNG&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img alt=&quot;Replay of my chat with ChatGPT on archive.is&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-Kx_6gHLiixTNAnoQRgQoPxHREzm8TEdA8y6pdAymbeQVKuOXTSVuNGm-GgeUp-_GwkiAebcZ6z3ctezyfJ7xWlwZWpQVY1qH4VjN7wyfqOW7dKz52bwuMglRpoOiDeNXhCRyUQL3ed43GDnwpRDLlUWD_YNmUfwORJJlytHX-InpaNfEE01VNVP6BCKP/s16000/is1.PNG&quot; title=&quot;Replay of my chat with ChatGPT on archive.is&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;p&gt;Note that this method of archiving the chat with ChatGPT does not work with the Internet Archive. It looks like the page has been saved using SPN, but the page does not replay.&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWBR_RK2Nfqi4rfZPRmoCiBr3t0mAFcLAKHQApNSCvy-ErmpNaEc6qRhGnOTrtr83OHXdoxWSF54PIpQ0W9TyDGyYN8rZGDARqOeNJU7bV9eF0nWVVlgMn8ZN1QQZABTgD1l7Vh26n4eJmlHEDlshH1lc7DQ3sC5iJbKHrU_HCNeKe265Oi7bhuvh9VuH9/s964/ia1.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Using SPN to archive my chat with ChatGPT&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWBR_RK2Nfqi4rfZPRmoCiBr3t0mAFcLAKHQApNSCvy-ErmpNaEc6qRhGnOTrtr83OHXdoxWSF54PIpQ0W9TyDGyYN8rZGDARqOeNJU7bV9eF0nWVVlgMn8ZN1QQZABTgD1l7Vh26n4eJmlHEDlshH1lc7DQ3sC5iJbKHrU_HCNeKe265Oi7bhuvh9VuH9/s16000/ia1.PNG&quot; title=&quot;Using SPN to archive my chat with ChatGPT&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Using SPN to archive my chat with ChatGPT&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;p&gt;&lt;br /&gt;&lt;/p&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidWzIVs7vcFRn5lVIzzg3JprCmAMWIAnchF6yYAj9ZGzxKwzTIDNcVnKvGkY3E6qHuO46SlwijEAsR-TnPeck_fPO5hEDqvl7H0EIpKh6MLCxS1dkhxT_TVRgrfZ1MQhBPU3a6XQSgBEmRVakXhN9xovfWsdzZRNIxWtPkloW8ZkvcZFlX3GDFc7hbotIy/s953/ia2.PNG&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;The Wayback Machine has not archived the link I submitted for archiving&quot; border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidWzIVs7vcFRn5lVIzzg3JprCmAMWIAnchF6yYAj9ZGzxKwzTIDNcVnKvGkY3E6qHuO46SlwijEAsR-TnPeck_fPO5hEDqvl7H0EIpKh6MLCxS1dkhxT_TVRgrfZ1MQhBPU3a6XQSgBEmRVakXhN9xovfWsdzZRNIxWtPkloW8ZkvcZFlX3GDFc7hbotIy/s16000/ia2.PNG&quot; title=&quot;The Wayback Machine has not archived the link I submitted for archiving&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;The Wayback Machine has not archived the link I &lt;a href=&quot;https://web.archive.org/web/20250201023434/https://chatgpt.com/web/20250201023434/https://chatgpt.com/c/679d6bfa-3a88-8006-aa1f-df16cb29d6e1&quot; target=&quot;_blank&quot;&gt;submitted for archiving&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;h2&gt;ChatGPT cannot provide accurate citations from the Quran in Arabic&lt;/h2&gt;&lt;p&gt;From this experiment, I found that ChatGPT &quot;improvement&quot; is similar to that of Google Gemini (from &lt;a href=&quot;https://ws-dl.blogspot.com/2025/05/2025-01-25-part-1-large-language-models.html&quot; target=&quot;_blank&quot;&gt;part 1 of this post&lt;/a&gt;). They both did not produce errors for the same prompts issued in the paper in 2023. The results were all correct citations with no errors, however, the answers had wrong citations to correct verses in the Quran for different prompts that have the same meaning. ChatGPT, as well as Google Gemini, provided phrases that do not exist in the Quran (mixtures of words that exist in the Quran and elsewhere) and attributed them to the Quran. Some of these phrases included syntax errors and altered verses attributing them to the Quran which is problematic for Muslims who believe that the Quran is free from errors and modifications. &lt;/p&gt;&lt;p&gt;After two years and going from ChatGPT-3.5 to ChatGPT-4o, the errors and hallucinations remain present when using slightly different prompts. Some researchers argue that LLMs' errors and hallucinations not only will remain present in all LLMs, but they will increase as more and more LLMs' generated content make it to the training datasets (content on the internet), and result in a &lt;a href=&quot;https://arxiv.org/abs/2305.17493&quot; target=&quot;_blank&quot;&gt;total collapse of the model&lt;/a&gt;.&lt;/p&gt;&lt;h2&gt;Conclusions&lt;/h2&gt;&lt;div&gt;LLMs have taken the world by surprise in the last few years with the introduction of chatbots like OpenAI’s ChatGPT. LLMs have been misused in the past and they will continue to be misused by some users. Misinformation is one of the toughest problems with LLMs. In this post, we demonstrated that ChatGPT's hallucinations continue to prevent it from being able to provide correct Quranic citations. Unfortunaqtely, Wrong ChatGPT's answers to questions about the Quran are presented in a Quranic-like format which makes users believe them to be correct references to the Quran. ChatGPT also mixes correct and incorrect Quranic citations making it harder for non-experts to separate right from wrong answers. The mixture of incorrect and correct Quran citations is not accepted by Muslims who believe the Quran to be free from errors and should never be modified. Some researchers argue that misinformation cannot be eliminated from LLMs because training datasets will continue to have both misinformation and disinformation.&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;--&lt;a href=&quot;https://x.com/Hussam_A_Hallak&quot; target=&quot;_blank&quot;&gt;Hussam Hallak&lt;/a&gt;&lt;/div&gt;</content>
  2472.      <author>
  2473.  <name>Hussam Hallak (noreply@blogger.com)</name>
  2474. <uri>https://ws-dl.blogspot.com/</uri>
  2475.      </author>
  2476.    </entry>
  2477.  
  2478.    <entry>
  2479.      <title>Web Science and Digital Libraries (WS-DL) Group at Old Dominion University: 2025-05-06: Part 3 - Large Language Models (LLMs) are hallucinating in Arabic about the Quran  (DeepSeek)</title>
  2480.      <link rel="alternate" type="text/html" href="https://ws-dl.blogspot.com/2025/05/2025-05-06-part-3-large-language-models.html"/>
  2481.      <id>tag:blogger.com,1999:blog-953024975153422094.post-1758190156233622895</id>
  2482.      <updated>2025-06-13T16:56:43+00:00</updated>
  2483.      <content type="html">&lt;div style=&quot;text-align: left;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhwPwlcfmvK-AaoURdZGUpzlkaOFwXnQntY7Bt4O5_PsKDmN6xeNeg8WXw3k28iwkLiQC17WiNLmzi4pArXnaiFmfY6kK2ey-SkR_hDtIV7UFtjX3j_UvY51MYgY3gUvlEuXvLZg5H7yq4EQr-ASbTGkKTI4otNKutcVpgO4ZXURygvNJ1vAd82yCjwtFUf/s1266/Capture.PNG&quot; style=&quot;margin-left: 1em; margin-right: 1em; text-align: center;&quot;&gt;&lt;img border=&quot;0&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhwPwlcfmvK-AaoURdZGUpzlkaOFwXnQntY7Bt4O5_PsKDmN6xeNeg8WXw3k28iwkLiQC17WiNLmzi4pArXnaiFmfY6kK2ey-SkR_hDtIV7UFtjX3j_UvY51MYgY3gUvlEuXvLZg5H7yq4EQr-ASbTGkKTI4otNKutcVpgO4ZXURygvNJ1vAd82yCjwtFUf/s16000/Capture.PNG&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;div style=&quot;text-align: left;&quot;&gt;&lt;br /&gt;&lt;/div&gt;&lt;h2 style=&quot;text-align: left;&quot;&gt;Introduction&lt;/h2&gt;&lt;div style=&quot;text-align: left;&quot;&gt;By the time I finished trying to reproduce the results in &lt;a href=&quot;https://doi.org/10.1080/14746700.2023.2255944&quot; target=&quot;_blank&quot;&gt;the paper I reviewed&lt;/a&gt; in &lt;a href=&quot;https://ws-dl.blogspot.com/2025/05/2025-01-25-part-1-large-language-models.html&quot; target=&quot;_blank&quot;&gt;Part 1&lt;/a&gt; and &lt;a href=&quot;https://ws-dl.blogspot.com/2025/05/2025-05-06-part-2-large-language-models.html&quot; target=&quot;_blank&quot;&gt;Part 2&lt;/a&gt; of this blog post, &lt;a href=&quot;https://www.deepseek.com/&quot; target=&quot;_blank&quot;&gt;DeepSeek&lt;/a&gt; released its first free chatbot app DeepSeek-V3 on January 20th 2025. I could not fight the urge to see how it compares to Open AI's ChatGPT and Google Gemini. The purpose of this experiment is to provide a &lt;a href=&quot;https://studywell.com/proof/disproof-by-counterexample/&quot; target=&quot;_blank&quot;&gt;disproof by counter example&lt;/a&gt;, that, contrary to popular belief, LLMs are not capable of producing error-free answers to questions. I am using prompts to find Arabic verses in the Quran on misinformation.&lt;/div&gt;&lt;div style=&quot;text-align: left;&quot;&gt;&lt;br /&gt;&lt;/div&gt;&lt;div style=&quot;text-align: left;&quot;&gt;I repeated the same experiment I did with Google Gemini and ChatGPT-4o in January 2025; the results were not better. In addition to being slower, I kept getting the annoying message &quot;The server is busy message. Please try again later.&quot; which I didn't try to find a solution for because the service got restored when I waited and tried again later. For prompts tested in the paper, DeepSeek's answers to the first prompt (What does the Quran have to say about misinformation?) were consistent and correct. No incorrect or made-up verses attributed to the Quran. Similarly, no errors in the answers when I asked this question to &quot;What does the Quran have to say about misinformation? please cite verses in Arabic&quot; to see how DeepSeek performs citing Quranic verses in Arabic. DeepSeek correctly cited Quran 2:42, 49:6, 24:15-16, 17:36, 50:18, 25:72, and 49:12 two times in a row. However, when I asked for additional verses, one of DeepSeek's presented phrases was correct, but it attributed the phrase to a different verse in a different chapter in the Quran. &lt;/div&gt;&lt;div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #783f04;&quot;&gt;Me: Give me additional verses&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;DeepSeek: Okay, the user has already asked about what the Quran says regarding misinformation and received several detailed responses with cited verses. Now they're asking for additional verses. I need to make sure I don't repeat the ones already mentioned. Let me recall the previous answers and check which verses were covered.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;In the last interaction, I listed seven key verses in Arabic with translations. So the user now wants more verses beyond those. Let me think of other relevant Quranic verses that address truthfulness, lying, false testimony, or spreading false information.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Surah Al-Isra (17:35) comes to mind about fulfilling measures and testifying with justice. Then Surah Al-Ma'idah (5:8) talks about standing for justice even against oneself. Surah Al-Mu'minun (23:3) mentions avoiding vain talk. Surah Al-Hajj (22:30) advises avoiding false speech. Surah Al-Baqarah (2:283) discusses trust and testimony. Surah Al-Nahl (16:105) specifies that liars disbelieve in Allah's signs. Surah Al-Mutaffifin (83:1-3) condemns fraud. Surah Al-An'am (6:144) warns against speaking without knowledge. Surah Al-Nisa (4:112) talks about bearing sin for false accusations.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;I should present these new verses in Arabic with transliteration and translation, ensuring they haven't been mentioned before. Also, the user might be looking for a comprehensive list for study or reference, so context and lessons are important. Need to organize them clearly, maybe with headings for each verse. Let me verify the references to avoid errors. Once checked, structure the response with each verse, its Arabic text, transliteration, translation, context, and a key lesson. Conclude by summarizing the Quran's stance on truth and integrity. Make sure the formatting is consistent with previous answers for clarity.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Certainly! Here are additional Quranic verses in Arabic that emphasize truthfulness, condemn falsehood, and guide ethical communication, along with translations and contexts:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;1. Prohibition of False Testimony&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 17:35&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَأَوْفُوا الْكَيْلَ إِذَا كِلْتُمْ وَزِنُوا بِالْقِسْطَاسِ الْمُسْتَقِيمِ ۚ ذَٰلِكَ خَيْرٌ وَأَحْسَنُ تَأْوِيلًا&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa awfū al-kayla idhā kiltum wa zīnū bil-qisṭāsi al-mustaqīm. Dhālika khayrun wa aḥsanu ta’wīlā.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Give full measure when you measure, and weigh with an even balance. That is best and fairest in the end.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Condemns dishonesty in speech and action, urging fairness and integrity.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;2. Command to Uphold Justice&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 5:8&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يَا أَيُّهَا الَّذِينَ آمَنُوا كُونُوا قَوَّامِينَ لِلَّهِ شُهَدَاءَ بِالْقِسْطِ ۖ وَلَا يَجْرِمَنَّكُمْ شَنَآنُ قَوْمٍ عَلَىٰ أَلَّا تَعْدِلُوا ۚ اعْدِلُوا هُوَ أَقْرَبُ لِلتَّقْوَىٰ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Yā ayyuhalladhīna āmanū kūnū qawwāmīna lillāhi shuhadā’a bil-qisṭi wa lā yajrimannakum sha’nānu qawmin ʿalā allā taʿdilū. Iʿdilū huwa aqrabu lil-taqwā.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;O believers! Stand firm for Allah as witnesses for justice, and do not let hatred of a people lead you to injustice. Be just—that is closer to righteousness.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Commands justice in speech and testimony, even toward enemies.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;3. Avoiding Vain Speech&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 23:3&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَالَّذِينَ هُمْ عَنِ اللَّغْوِ مُعْرِضُونَ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa alladhīna hum ʿani al-laghwi muʿriḍūn.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;And those who turn away from ill speech.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Praises those who reject gossip, slander, and meaningless talk.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;4. Prohibition of False Speech&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 22:30&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَاجْتَنِبُوا قَوْلَ الزُّورِ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa ijtānibū qawla al-zūr.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;And shun false speech.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Part of a broader command to avoid all forms of sin, including lies.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;5. Condemnation of Breaching Trusts&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 2:283&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَا تَكْتُمُوا الشَّهَادَةَ ۚ وَمَن يَكْتُمْهَا فَإِنَّهُ آثِمٌ قَلْبُهُ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa lā taktumū al-shahādah. Wa man yaktumhā fa-innahu āthimun qalbuh.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Do not conceal testimony, for whoever conceals it has a sinful heart.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Warns against withholding or distorting truthful testimony.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;6. Truthfulness as a Mark of Faith&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 16:105&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;إِنَّمَا يَفْتَرِي الْكَذِبَ الَّذِينَ لَا يُؤْمِنُونَ بِآيَاتِ اللَّهِ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Innamā yaftarī al-kaḏhaba alladhīna lā yu’minūna bi-āyāti Allāh.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Only those who disbelieve in Allah’s signs fabricate lies.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Links lying to a lack of faith, emphasizing truthfulness as a hallmark of belief.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;7. Warning Against Fraud and Deception&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 83:1–3&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَيْلٌ لِّلْمُطَفِّفِينَ. الَّذِينَ إِذَا اكْتَالُوا عَلَى النَّاسِ يَسْتَوْفُونَ. وَإِذَا كَالُوهُمْ أَو وَّزَنُوهُمْ يُخْسِرُونَ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Waylun lil-muṭaffifīn. Alladhīna idhā aktālū ʿalā al-nāsi yastawfūn. Wa idhā kālūhum aw wazanūhum yukhsiṛūn.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Woe to those who give less [than due], who demand full measure from others but give less when they measure or weigh [for them].&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Condemns dishonesty in transactions, a form of misinformation for gain.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;8. Rejecting Baseless Accusations&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;background-color: #ea9999; color: #2b00fe;&quot;&gt;Quran 6:144&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَمَن يَكْسِبْ خَطِيئَةً أَوْ إِثْمًا ثُمَّ يَرْمِ بِهِ بَرِيئًا فَقَدِ احْتَمَلَ بُهْتَانًا وَإِثْمًا مُّبِينًا&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa man yaksib khaṭī’atan aw ithman thumma yarmi bihi barī’an faqadi iḥtamala buhtānan wa ithman mubīnā.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Whoever commits a sin or error and then blames it on an innocent person has burdened himself with slander and a blatant sin.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Forbids shifting blame through lies or false accusations.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;9. Command to Speak Kindly and Truthfully&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;background-color: #93c47d; color: #2b00fe;&quot;&gt;Quran 4:112&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;background-color: #93c47d; color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;background-color: #93c47d; color: #2b00fe;&quot;&gt;وَمَن يَكْسِبْ خَطِيئَةً أَوْ إِثْمًا ثُمَّ يَرْمِ بِهِ بَرِيئًا فَقَدِ احْتَمَلَ بُهْتَانًا وَإِثْمًا مُّبِينًا&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa man yaksib khaṭī’atan aw ithman thumma yarmi bihi barī’an faqadi iḥtamala buhtānan wa ithman mubīnā.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Whoever commits a sin or error and then blames it on an innocent person has borne a grave slander and a manifest sin.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Reiterates the gravity of false accusations.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Key Lessons&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;•&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Integrity in Speech: The Quran repeatedly ties truthfulness to faith (e.g., 16:105).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;•&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Social Justice: Lies harm individuals and communities (2:283, 83:1–3).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;•&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Divine Accountability: Every word and intention is recorded (50:18).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;•&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Rejecting Harm: Silence in the face of falsehood is complicity (24:15–16).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;These verses collectively establish a robust ethical framework for combating misinformation, urging believers to prioritize truth, fairness, and responsibility in all communication.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;Item number eight states that the phrase&lt;/div&gt;&lt;div&gt;وَمَن يَكْسِبْ خَطِيئَةً أَوْ إِثْمًا ثُمَّ يَرْمِ بِهِ بَرِيئًا فَقَدِ احْتَمَلَ بُهْتَانًا وَإِثْمًا مُّبِينًا&lt;/div&gt;&lt;div&gt;is from Quran 6:144 but it is actually the entire verse Quran 4:112&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;The correct verse from Quran 6:144 is:&lt;/div&gt;&lt;div&gt;وَمِنَ ٱلۡإِبِلِ ٱثۡنَيۡنِ وَمِنَ ٱلۡبَقَرِ ٱثۡنَيۡنِ‌ۗ قُلۡ ءَآلذَّڪَرَيۡنِ حَرَّمَ أَمِ ٱلۡأُنثَيَيۡنِ أَمَّا ٱشۡتَمَلَتۡ عَلَيۡهِ أَرۡحَامُ ٱلۡأُنثَيَيۡنِ‌ۖ أَمۡ ڪُنتُمۡ شُہَدَآءَ إِذۡ وَصَّٮٰڪُمُ ٱللَّهُ بِهَـٰذَا‌ۚ فَمَنۡ أَظۡلَمُ مِمَّنِ ٱفۡتَرَىٰ عَلَى ٱللَّهِ ڪَذِبً۬ا لِّيُضِلَّ ٱلنَّاسَ بِغَيۡرِ عِلۡمٍ‌ۗ إِنَّ ٱللَّهَ لَا يَہۡدِى ٱلۡقَوۡمَ ٱلظَّـٰلِمِينَ &lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;The translation is:&lt;/div&gt;&lt;div&gt;And of the camels two (male and female), and of oxen two (male and female). Say: &quot;Has He forbidden the two males or the two females or (the young) which the wombs of the two females enclose? Or were you present when Allâh ordered you such a thing? Then who does more wrong than one who invents a lie against Allâh, to lead mankind astray without knowledge. Certainly Allâh guides not the people who are Zâlimûn (polytheists and wrong-doers).&quot;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;It is interesting that item number nine in the answer, right below number eight, is the same verse but correctly referenced this time.&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEiwUMebHdvDl19wxjL3DySoJEQZeUdBmkwqjIUuNhE0kt7La4idT6QFI8dK4qtUMn6jrbCpcMWvGnzPud_X_jh0Fpetl9RzcvuFu5oSWxzVOBne1E3BCcQaD10Ac7zfr6xtQOzzl7lJcjXrOB1JZ09WDoRjMeUh6xpBNEQZjWN0N0pA0x9Pp___Wi2iJk3F&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Wrong chapter and verse number from DeepSeek&quot; src=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEiwUMebHdvDl19wxjL3DySoJEQZeUdBmkwqjIUuNhE0kt7La4idT6QFI8dK4qtUMn6jrbCpcMWvGnzPud_X_jh0Fpetl9RzcvuFu5oSWxzVOBne1E3BCcQaD10Ac7zfr6xtQOzzl7lJcjXrOB1JZ09WDoRjMeUh6xpBNEQZjWN0N0pA0x9Pp___Wi2iJk3F=s16000&quot; title=&quot;Wrong chapter and verse number from DeepSeek&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Wrong chapter and verse number from DeepSeek&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;br /&gt;I asked DeepSeek for additional verses, and this time, DeepSeek started to hallucinate. The answer included a phrase that does not exist in the Quran at all, but DeepSeek attributed the phrase to a specific verse in the Quran.&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #783f04;&quot;&gt;Me: Give me additional verses&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Thought for 37 seconds&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;DeepSeek: Okay, the user has asked for additional Quranic verses related to misinformation again. Let me see. They previously requested verses in Arabic with translations, and I provided several. Now they want more.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;First, I need to recall which verses I've already listed. The previous ones included 2:42, 49:6, 24:15-16, 17:36, 50:18, 25:72, 49:12, and then in the last response, I added 17:35, 5:8, 23:3, 22:30, 2:283, 16:105, 83:1-3, 6:144, 4:112. So I need to find more verses that haven't been mentioned yet but still address truthfulness, falsehood, gossip, or accountability in speech.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Let me check other relevant verses. Surah Al-Ma'idah 5:2 comes to mind about cooperation in righteousness. Also, Surah Al-Isra 17:34 about fulfilling promises. Surah Al-Hujurat 49:11 prohibits insulting others. Surah Al-Mutaffifin 83:29-36 talks about the consequences for those who mock believers. Surah Al-Baqarah 2:188 warns against deceit in property. Surah Al-Ahzab 33:70-71 encourages truthful speech. Surah Al-Mumtahanah 60:12 on not hiding testimony. Surah Al-Nahl 16:116 against lying about what's permissible. And Surah Al-Tawbah 9:119 urges being with the truthful.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;I need to present these in the same format as before: Arabic, transliteration, translation, and context. Make sure the translations are accurate and the contexts clearly relate to misinformation or ethical communication. Also, check if any of these were already covered. For example, 2:283 was mentioned before, so I should avoid repetition. Wait, 2:188 is about property, which is a form of deceit, so it's relevant. Similarly, Surah Al-Tawbah 9:119 is about being with the truthful, which supports truthfulness in general.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Ensure that each verse adds value to the topic. For instance, Surah Al-Ma'idah 5:2 encourages cooperation in good deeds and avoiding sin, which ties into promoting truth and avoiding falsehood. Surah Al-Isra 17:34 about fulfilling promises relates to honesty. Surah Al-Hujurat 49:11 is about not insulting, which is a form of harmful speech.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Double-check the translations for accuracy. Maybe use Sahih International or another reputable translation. Also, verify the Arabic script to ensure it's correct. Organize them in a logical order, perhaps by theme. Start with cooperation in righteousness, then fulfilling promises, avoiding insults, mocking others, deceit in transactions, truthful speech, hiding testimony, lying about halal/haram, and the call to be with the truthful.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Finally, summarize the key lessons from these additional verses to reinforce the Quran's stance on truthfulness and against misinformation. Make sure the response is clear and helpful, providing the user with meaningful content that expands on the previous answers without redundancy.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Certainly! Here are additional Quranic verses in Arabic that emphasize truthfulness, condemn deception, and guide ethical communication, along with translations and contexts:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;1. Cooperation in Righteousness, Not Sin&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 5:2&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَتَعَاوَنُوا عَلَى الْبِرِّ وَالتَّقْوَىٰ ۖ وَلَا تَعَاوَنُوا عَلَى الْإِثْمِ وَالْعُدْوَانِ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa taʿāwanū ʿalā al-birri wa al-taqwā wa lā taʿāwanū ʿalā al-ithmi wa al-ʿudwān.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Cooperate in righteousness and piety, and do not cooperate in sin and aggression.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Encourages collective efforts to uphold truth and justice while rejecting complicity in falsehood or harm.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;2. Fulfilling Promises and Trusts&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 17:34&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَأَوْفُوا بِالْعَهْدِ ۖ إِنَّ الْعَهْدَ كَانَ مَسْئُولًا&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa awfū bil-ʿahdi inna al-ʿahda kāna mas’ūlā.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Fulfill your promises, for you will surely be held accountable for them.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Highlights the importance of honesty in commitments, a form of truthful communication.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;3. Prohibition of Insulting Others&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 49:11&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَا تَلْمِزُوا أَنفُسَكُمْ وَلَا تَنَابَزُوا بِالْأَلْقَابِ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa lā talmizū anfusakum wa lā tanāabazū bil-alqāb.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Do not insult one another or call each other by offensive nicknames.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Forbids demeaning speech, which can fuel division and false narratives.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;4. Condemnation of Mockery&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 83:29–36&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;إِنَّ الَّذِينَ أَجْرَمُوا كَانُوا مِنَ الَّذِينَ آمَنُوا يَضْحَكُونَ. وَإِذَا مَرُّوا بِهِمْ يَتَغَامَزُونَ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Inna alladhīna ajramū kānū mina alladhīna āmanū yaḍḥakūn. Wa idhā marrū bihim yataghāmazūn.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Indeed, the wicked used to laugh at the believers, winking [mockingly] when they passed by them.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Condemns mockery and belittling others, behaviors that often accompany misinformation.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;5. Prohibition of Deceit in Transactions&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 2:188&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَا تَأْكُلُوا أَمْوَالَكُم بَيْنَكُم بِالْبَاطِلِ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa lā ta’kulū amwālakum baynakum bil-bāṭil.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Do not consume one another’s wealth unjustly.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Forbids deceitful practices, including lying for financial gain.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;6. Command to Speak Truthfully&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 33:70–71&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يَا أَيُّهَا الَّذِينَ آمَنُوا اتَّقُوا اللَّهَ وَقُولُوا قَوْلًا سَدِيدًا. يُصْلِحْ لَكُمْ أَعْمَالَكُمْ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Yā ayyuhalladhīna āmanū ittaqū Allāha wa qūlū qawlan sadīdā. Yuṣliḥ lakum aʿmālakum.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;O believers! Fear Allah and speak uprightly. He will mend your deeds for you.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Links truthful speech to divine guidance and rectification of actions.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;7. Rejecting Hidden Testimony&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 60:12&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يَا أَيُّهَا النَّبِيُّ إِذَا جَاءَكَ الْمُؤْمِنَاتُ يُبَايِعْنَكَ عَلَىٰ أَن لَّا يُشْرِكْنَ بِاللَّهِ... &lt;span style=&quot;background-color: #ea9999;&quot;&gt;وَلَا يَكْتُمْنَ شَهَادَةً&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Yā ayyuha al-nabiyyu idhā jā’aka al-mu’minātu yubāyiʿnaka ʿalā an lā yush’rikna billāh... Wa lā yaktumna shahādah.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;O Prophet! When believing women pledge allegiance to you... [they must] not conceal testimony.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Requires transparency in testimony, rejecting the suppression of truth.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;8. Condemnation of False Oaths&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 16:116&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;وَلَا تَقُولُوا لِمَا تَصِفُ أَلْسِنَتُكُمُ الْكَذِبَ هَٰذَا حَلَالٌ وَهَٰذَا حَرَامٌ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Wa lā taqūlū limā taṣifu alsinatukumu al-kaḏiba hādhā ḥalālun wa hādhā ḥarām.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;Do not say about what your tongues falsely claim, ‘This is lawful, and this is forbidden.’&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Condemns lying to legitimize or prohibit matters falsely.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;9. The Call to Be with the Truthful&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Quran 9:119&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Arabic:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;يَا أَيُّهَا الَّذِينَ آمَنُوا اتَّقُوا اللَّهَ وَكُونُوا مَعَ الصَّادِقِينَ&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Transliteration: Yā ayyuhalladhīna āmanū ittaqū Allāha wa kūnū maʿa al-ṣādiqīn.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Translation: &quot;O believers! Fear Allah and be with those who are true.&quot;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Context: Urges alignment with truthful people, rejecting circles of falsehood.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;________________________________________&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;Key Lessons from These Verses&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;1.&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Collective Responsibility: Uphold truth collectively (5:2) and avoid complicity in lies.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;2.&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Integrity in Transactions: Deceit for gain is forbidden (2:188).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;3.&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Dignity in Speech: Mockery, insults, and belittling others are condemned (49:11, 83:29–36).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;4.&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Transparency: Concealing testimony or distorting facts violates trust (60:12, 16:116).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;5.&lt;span style=&quot;white-space: pre;&quot;&gt; &lt;/span&gt;Divine Alignment: Truthfulness aligns believers with God’s guidance (33:70–71).&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span style=&quot;color: #2b00fe;&quot;&gt;These verses reinforce the Quran’s comprehensive framework for combating misinformation, emphasizing that truth is a divine mandate and lies are a moral and spiritual corruption. In an age of misinformation, these teachings call for vigilance, integrity, and collective accountability.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;Item number four in the answer is a reference to two correct verses from Quran 83:29–30, but DeepSeek attributed them to Quran 83:29–36.&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;Item number seven in the answer attributed the phrase:&lt;/div&gt;&lt;div&gt;يَا أَيُّهَا النَّبِيُّ إِذَا جَاءَكَ الْمُؤْمِنَاتُ يُبَايِعْنَكَ عَلَىٰ أَن لَّا يُشْرِكْنَ بِاللَّهِ... وَلَا يَكْتُمْنَ شَهَادَةً&lt;/div&gt;&lt;div&gt;to Quran 60:12&lt;/div&gt;&lt;div&gt;The first sentence exists in Quran 60:12 but the second sentence, after the three dots, does not exist in the Quran at all. It translates to &quot;And them (believing women) will not conceal a testimony&quot;.&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;The correct Quran 60:12 is:&lt;/div&gt;&lt;div&gt;يَـٰٓأَيُّہَا ٱلنَّبِىُّ إِذَا جَآءَكَ ٱلۡمُؤۡمِنَـٰتُ يُبَايِعۡنَكَ عَلَىٰٓ أَن لَّا يُشۡرِكۡنَ بِٱللَّهِ شَيۡـًٔ۬ا وَلَا يَسۡرِقۡنَ وَلَا يَزۡنِينَ وَلَا يَقۡتُلۡنَ أَوۡلَـٰدَهُنَّ وَلَا يَأۡتِينَ بِبُهۡتَـٰنٍ۬ يَفۡتَرِينَهُ ۥ بَيۡنَ أَيۡدِيہِنَّ وَأَرۡجُلِهِنَّ وَلَا يَعۡصِينَكَ فِى مَعۡرُوفٍ۬‌ۙ فَبَايِعۡهُنَّ وَٱسۡتَغۡفِرۡ لَهُنَّ ٱللَّهَ‌ۖ إِنَّ ٱللَّهَ غَفُورٌ۬ رَّحِيمٌ۬ &lt;/div&gt;&lt;div&gt;The translation is:&lt;/div&gt;&lt;div&gt;O Prophet! When believing women come to you to give you the Bai'âh (pledge), that they will not associate anything in worship with Allah, that they will not steal, that they will not commit illegal sexual intercourse, that they will not kill their children, that they will not utter slander, intentionally forging falsehood (i.e. by making illegal children belonging to their husbands), and that they will not disobey you in Ma'rûf (Islâmic Monotheism and all that which Islâm ordains) then accept their Bai'âh (pledge), and ask Allâh to forgive them, Verily, Allâh is Oft-Forgiving, Most Merciful.&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEi2-RYEP5xtFjGFrzL2R4e2uRJR2xxmNx7xdnbHfftDul9hLK2_PriMiyrnbdoFuA8zzL8x8kONkbvVvWgWJghDhugERXICSRX4cY-_9qb8YMJ2p1zxSgApzTaBn774YfqFqm6i_joNj9DuJVmJUAhV-qsWl37SSD6JDexHJCzpq6vwku4oa49zfKncgGtQ&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Wrong answer from DeepSeek&quot; src=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEi2-RYEP5xtFjGFrzL2R4e2uRJR2xxmNx7xdnbHfftDul9hLK2_PriMiyrnbdoFuA8zzL8x8kONkbvVvWgWJghDhugERXICSRX4cY-_9qb8YMJ2p1zxSgApzTaBn774YfqFqm6i_joNj9DuJVmJUAhV-qsWl37SSD6JDexHJCzpq6vwku4oa49zfKncgGtQ=s16000&quot; title=&quot;Wrong answer from DeepSeek&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Wrong answer from DeepSeek&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;br /&gt;I stopped asking DeepSeek at this point because after performing this experiment on Google Gemini, ChatGPT, and now DeepSeek and getting the same type of errors and hallucinations, I am convinced that LLMs are not capable of correctly quoting verses from the Quran in Arabic. In addition, archiving my questions and DeepSeek's answers by public web archives is not possible because &lt;a href=&quot;https://ws-dl.blogspot.com/2018/07/2018-07-18-why-we-need-private-web.html&quot; target=&quot;_blank&quot;&gt;the chat page is behind a login wall which makes it not archivable&lt;/a&gt;.&lt;/div&gt;&lt;h2 style=&quot;text-align: left;&quot;&gt;Conclusions&lt;/h2&gt;&lt;div&gt;I initially was looking to study LLMs' improvements over the past two years quoting verses from the Quran. Although testing the same prompts in the paper did not produce any errors, slightly changing the prompts while keeping the same meaning resulted in producing errors and hallucinations in the answers using all three tools (Google Gemini, ChatGPT, and DeepSeek). Some researchers warned that misinformation and disinformation cannot be eliminated from LLMs output. They argue that LLMs' errors will continue to increase and result in a &lt;a href=&quot;https://arxiv.org/abs/2305.17493&quot; target=&quot;_blank&quot;&gt;total collapse of the model&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;The following table outlines the errors each model produced and the experiment year:&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;&lt;table align=&quot;center&quot; cellpadding=&quot;0&quot; cellspacing=&quot;0&quot; class=&quot;tr-caption-container&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td style=&quot;text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEi6worGztbnm51WTX-fBReuWtmT8rP_TbGJfsyuBUHYKZq5Nchpbznaz4dTiff2ZVGUb5IxDikoup7dgTHD0FwcV5qA7ivMwJe3K7rcYdZqupIw_EmO6sco3bXn6VnuDp7mWMjkBddzICe1XuFRYzL8OjI8kmdEJPWpFNDOQUujS2mHxnkWRGRfW6PRgp3C&quot; style=&quot;margin-left: auto; margin-right: auto;&quot;&gt;&lt;img alt=&quot;Errors produced by Google Gemini, ChatGPT, and DeepSeek in 2023 and 2025&quot; height=&quot;142&quot; src=&quot;https://blogger.googleusercontent.com/img/a/AVvXsEi6worGztbnm51WTX-fBReuWtmT8rP_TbGJfsyuBUHYKZq5Nchpbznaz4dTiff2ZVGUb5IxDikoup7dgTHD0FwcV5qA7ivMwJe3K7rcYdZqupIw_EmO6sco3bXn6VnuDp7mWMjkBddzICe1XuFRYzL8OjI8kmdEJPWpFNDOQUujS2mHxnkWRGRfW6PRgp3C=w641-h142&quot; title=&quot;Errors produced by Google Gemini, ChatGPT, and DeepSeek in 2023 and 2025&quot; width=&quot;641&quot; /&gt;&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td class=&quot;tr-caption&quot; style=&quot;text-align: center;&quot;&gt;Types of errors produced by Google Gemini, ChatGPT, and DeepSeek&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;div&gt;Retrieval-Augmented Generation (RAG) might be a promising technique that can significantly reduce errors and hallucinations in LLMs, especially for tasks that involve citing Quranic verses since the Quran has a fixed structure of 114 chapters (Surahs) and a set number of verses (Ayahs). This textual stability makes it an ideal candidate for RAG. The model doesn't need to &quot;guess&quot; or &quot;infer&quot; the content of a verse; it can retrieve the exact text. Therefore, for citing Quranic verses, it is possible that RAG LLM can significantly enhance the reliability and trustworthiness of AI-generated responses involving Quranic citations.&lt;/div&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;div&gt;--&lt;a href=&quot;https://x.com/Hussam_A_Hallak&quot; target=&quot;_blank&quot;&gt;Hussam Hallak&lt;/a&gt;&lt;/div&gt;</content>
  2484.      <author>
  2485.  <name>Hussam Hallak (noreply@blogger.com)</name>
  2486. <uri>https://ws-dl.blogspot.com/</uri>
  2487.      </author>
  2488.    </entry>
  2489.  
  2490.    <entry>
  2491.      <title>Journal of Web Librarianship: The More Things Change: Tracking Social Media Platform Shifts in Academic Libraries (2019–2025)</title>
  2492.      <link rel="alternate" type="text/html" href="https://www.tandfonline.com/doi/full/10.1080/19322909.2025.2502930?ai=1dl&amp;mi=co84bk&amp;af=R"/>
  2493.      <id>/doi/full/10.1080/19322909.2025.2502930?ai=1dl&amp;mi=co84bk&amp;af=R</id>
  2494.      <updated>2025-06-13T08:46:41+00:00</updated>
  2495.      <content type="html">. &lt;br /&gt;</content>
  2496.      <author>
  2497.  <name>Holly M. Surbaugh University Libraries, University of New Mexico, Albuquerque, New Mexico, USAHolly M. Surbaugh serves as an Undergraduate Engagement Librarian in the Learning, Research, and Engagement department of the University of New Mexico Libraries. She has previously worked in both public libraries and as a research analyst and data, information, and knowledge management consultant. She earned a master’s degree in library science from the University of Maryland’s iSchool and a bachelor’s degree from the University of Missouri School of Journalism.</name>
  2498. <uri>https://www.tandfonline.com/loi/wjwl20?ai=1dl&mi=co84bk&af=R</uri>
  2499.      </author>
  2500.    </entry>
  2501.  
  2502.    <entry>
  2503.      <title>Journal of Web Librarianship: Creating online tutorials: A practical guide for librarians (2nd ed.)</title>
  2504.      <link rel="alternate" type="text/html" href="https://www.tandfonline.com/doi/full/10.1080/19322909.2025.2515371?ai=1dl&amp;mi=co84bk&amp;af=R"/>
  2505.      <id>/doi/full/10.1080/19322909.2025.2515371?ai=1dl&amp;mi=co84bk&amp;af=R</id>
  2506.      <updated>2025-06-13T08:45:57+00:00</updated>
  2507.      <content type="html">. &lt;br /&gt;</content>
  2508.      <author>
  2509.  <name>Alyshia Bagley Collection Services Department, Georgia Southern University, Statesboro, GA, USA</name>
  2510. <uri>https://www.tandfonline.com/loi/wjwl20?ai=1dl&mi=co84bk&af=R</uri>
  2511.      </author>
  2512.    </entry>
  2513.  
  2514. </feed>
  2515.  
Copyright © 2002-9 Sam Ruby, Mark Pilgrim, Joseph Walton, and Phil Ringnalda