Cosimo Fedeli
@cosimofedeli.bsky.social
Tinkerer, data scientist, AI engineer, astrophysics PhD. I follow science, data, tech, business & finance. Views are my own.

https://www.linkedin.com/in/cosimo-fedeli
"The partnership also extends into security, with Klarna using Google Cloud's AI hardware and expertise to train and deploy graph neural networks to combat fraud and money laundering on its platform."
October 13, 2025 at 9:45 AM
Firing a bunch of people just because you've seen a fancy auto-complete seems premature at best.
May 13, 2025 at 9:29 PM
Nice! Is there a specific reason why - even before the pandemic - the seasonally-adjusted trend seems to be increasing? Increasing children population perhaps - given the vertical scale is absolute?
April 14, 2025 at 3:53 PM
This is, without a doubt, a pair of papers that greatly influenced my way of thinking as a grad student
April 2, 2025 at 4:10 PM
The second paper specializes this result to conformally stationary spacetimes, which allows one to derive the full gravitational lensing formalism that is commonly used in cosmology and theoretical astrophysics
On Fermat's principle in general relativity. II. The conformally stationary case
ui.adsabs.harvard.edu
April 2, 2025 at 4:10 PM
As the first paper shows, it turns out that light emitted at a specific emission event follows a trajectory that makes the arrival time stationary (in the simplest cases, a minimum), as measured in the reference frame of the observer
On Fermat's principle in general relativity. I. The general case
ui.adsabs.harvard.edu
April 2, 2025 at 4:10 PM
While this concept is simple enough in Newtonian physics, it becomes ambiguous in a relativistic setting, where time is relative to the observer. Adapting Fermat’s principle to General Relativity was done in 1990 by Volker Perlick in the two beautiful papers below
April 2, 2025 at 4:10 PM
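The principle described in this thread can be sketched as follows (an informal statement only; see Perlick's papers for the precise hypotheses):

```latex
% Fix an emission event p and a timelike observer worldline \gamma.
% Consider all smooth, future-directed null curves \lambda from p that
% terminate on \gamma, and let \tau(\lambda) denote the observer's proper
% time at the arrival point. Fermat's principle in general relativity
% (paper I) states that \lambda is a light ray (a null geodesic) if and
% only if the arrival time is stationary under variations in this class:
\delta \tau(\lambda) = 0
% In the conformally stationary case (paper II), this reduces to a
% variational principle for a purely spatial Riemannian ("Fermat")
% metric, which underlies the standard gravitational-lensing formalism.
```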
Aside from the better performance on a number of tasks, the idea of modeling intermediate outputs and their relationships in a graph-like way is fascinating, and I think it may have useful applications for LLM-based inference
April 1, 2025 at 3:36 PM
Graph of Thoughts (GoT) is a framework based on graph theory, and it allows an LLM to aggregate, refine, and generate multiple intermediate “thoughts” (a thought is defined differently depending on the problem) before reaching the final output
Graph of Thoughts: Solving Elaborate Problems with Large Language Models
We introduce Graph of Thoughts (GoT): a framework that advances prompting capabilities in large language models (LLMs) beyond those offered by paradigms such as Chain-of-Thought or Tree of Thoughts (T...
arxiv.org
April 1, 2025 at 3:36 PM
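The generate/aggregate/refine operations described above can be sketched as a small graph structure. This is a minimal illustration, not the paper's API: all names are made up, and the LLM call is stubbed out so the sketch runs on its own.

```python
from dataclasses import dataclass, field

@dataclass
class Thought:
    """An intermediate result; `parents` records which thoughts it came from."""
    content: str
    parents: list = field(default_factory=list)

def generate(thought, llm, n=2):
    """Branch: produce n new candidate thoughts from one."""
    return [Thought(llm(f"expand: {thought.content}"), [thought]) for _ in range(n)]

def aggregate(thoughts, llm):
    """Merge several thoughts into one. This node has multiple parents --
    the graph-specific operation that a pure tree (Tree of Thoughts) lacks."""
    joined = " | ".join(t.content for t in thoughts)
    return Thought(llm(f"combine: {joined}"), list(thoughts))

def refine(thought, llm):
    """Improve a thought, modeled as a child with a single parent
    (a self-refinement edge)."""
    return Thought(llm(f"improve: {thought.content}"), [thought])

# Stub LLM so the sketch runs without any model.
llm = lambda prompt: f"[{prompt}]"

root = Thought("sort the list")
branches = generate(root, llm)      # two candidate thoughts
merged = aggregate(branches, llm)   # one node, two parents: a graph, not a tree
final = refine(merged, llm)
print(len(merged.parents))  # prints 2
```

The point of the sketch is the `aggregate` step: allowing a node with several parents is what turns the chain/tree of intermediate outputs into a general directed graph.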
So true. I still have a bunch left over from my academic years. Who knows, perhaps when I retire - if my mind still works - I'll revisit them
March 29, 2025 at 11:04 PM
That's interesting. I always wanted to do something like this with the Commitment of Traders - more geared towards futures and commodities. One day.
March 7, 2025 at 10:52 PM
Once you go beyond second derivatives, you start running out of scientific-sounding names, as this clearly demonstrates: en.m.wikipedia.org/wiki/Jerk_(p...
Jerk (physics) - Wikipedia
en.m.wikipedia.org
March 7, 2025 at 10:37 PM
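For the record, the third time derivative of position (jerk) is easy to check numerically. A small sketch, using repeated finite differences on a test trajectory where the answer is known in closed form:

```python
import numpy as np

# For x(t) = t^3, the jerk d^3x/dt^3 is exactly 6 everywhere.
t = np.linspace(0.0, 1.0, 1001)
x = t**3
dt = t[1] - t[0]

# Three successive central-difference derivatives: velocity,
# acceleration, jerk (boundary points are less accurate).
jerk = np.gradient(np.gradient(np.gradient(x, dt), dt), dt)
print(jerk[500])  # well inside the interval: close to 6.0
```

Beyond jerk, the (semi-facetious) names continue with snap, crackle, and pop for the fourth, fifth, and sixth derivatives.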
As has been the case for traditional Machine Learning, my expectation is that - in this as in other Financial Services areas - AI adoption will vary significantly across jurisdictions and institutions.
February 24, 2025 at 1:50 PM