Alfredo Canziani
@alfcnz.bsky.social
Musician, math lover, cook, dancer, 🏳️🌈, and an ass prof of Computer Science at New York University
To compute the movement of the state x(t), we need to temporally integrate its velocity field ẋ(t). 🤓
The control signal, the steering angle, stays at 0, then at 0.05π, then ramps linearly to −0.20π. The vehicle moves along circular arcs.
Finally, a sweep over the initial velocity is performed.
July 2, 2025 at 3:17 PM
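A minimal sketch of this temporal integration (not the book's actual code): forward-Euler steps on a kinematic bicycle model, where the wheelbase, speed, horizon, step size, and the exact breakpoints of the steering schedule are all illustrative assumptions.

```python
# Forward-Euler integration of ẋ(t) = f(x(t), u(t)).
# Model f, wheelbase L, speed v, horizon T, and the steering
# breakpoints are assumptions for this sketch.
import numpy as np

def f(x, u, v=1.0, L=2.0):
    """Velocity field for state x = (px, py, heading θ), control u = steering angle."""
    _, _, th = x
    return np.array([v * np.cos(th), v * np.sin(th), v / L * np.tan(u)])

def u_of_t(t):
    """Steering: 0, then 0.05π, then a linear ramp down to −0.20π."""
    if t < 3.0:
        return 0.0
    if t < 6.0:
        return 0.05 * np.pi
    return 0.05 * np.pi + min((t - 6.0) / 4.0, 1.0) * (-0.25 * np.pi)

def integrate(x0, dt=0.01, T=12.0):
    x = np.array(x0, dtype=float)
    xs = [x.copy()]
    for k in range(int(T / dt)):
        x = x + dt * f(x, u_of_t(k * dt))   # Euler step: x ← x + dt·ẋ
        xs.append(x.copy())
    return np.stack(xs)

trajectory = integrate([0.0, 0.0, 0.0])     # constant steering ⇒ circular arcs
```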
Currently writing chapter 10, «Planning and control».
Physical constraints on the evolution of the state (e.g., pure rotation of the wheels) are encoded through the velocity of the state ẋ = dx(t)/dt, a function of the state x(t) and the control u(t).
June 30, 2025 at 8:37 PM
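To make the idea concrete, a hedged sketch (assuming a standard kinematic bicycle model, not necessarily the chapter's): the no-side-slip constraint of purely rolling wheels lives inside ẋ itself, so no admissible control can ever violate it.

```python
# The velocity field only spans directions compatible with rolling wheels:
# with ẋ = (v cos θ, v sin θ, v tan(u)/L), the lateral component of the
# planar velocity, −ṗx sin θ + ṗy cos θ, is identically zero.
import numpy as np

def x_dot(x, u, v=1.0, L=2.0):
    """State x = (px, py, θ); control u = steering angle (assumed model)."""
    _, _, th = x
    return np.array([v * np.cos(th), v * np.sin(th), v / L * np.tan(u)])

x = np.array([0.0, 0.0, 0.3])
for u in np.linspace(-0.2 * np.pi, 0.2 * np.pi, 5):
    dx = x_dot(x, u)
    lateral = -dx[0] * np.sin(x[2]) + dx[1] * np.cos(x[2])
    assert abs(lateral) < 1e-12   # the constraint holds for every control
```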
Reposted by Alfredo Canziani
Oh! The undergrad feedback came in! 🥹🥹🥹
June 10, 2025 at 8:04 PM
Releasing the Energy-Book 🔋, starting from its first appendix chapter, where I explain how I create my figures. 🎨
Feel free to report errors via the issue tracker, contribute to the exercises, and show me what you can draw via the discussion section. 🥳
github.com/Atcold/Energ...
June 9, 2025 at 8:56 PM
On a summer Friday night,
the first chapter sees the light.
🥹🥹🥹
June 7, 2025 at 12:12 AM
Yeah, it took me 20 days to get back 🥹🥹🥹
I swear I respond to instant messages as they get through! 🥲🥲🥲
Anyhow, one more successful semester completed. 🥳🥳🥳
May 29, 2025 at 6:34 PM
In this lecture from my new undergrad course, we review linear multiclass classification, leverage backprop and gradient descent to learn a linearly separable feature vector for the input, and observe the training dynamics in a 2D embedding space. 🤓
youtu.be/saskQ-EjCLQ
April 9, 2025 at 6:32 PM
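For readers who want the lecture's starting point in code, a minimal sketch (not the course's actual notebook; the toy data, sizes, and learning rate are assumptions): a plain linear multiclass classifier trained by gradient descent on the cross-entropy loss.

```python
import torch
from torch import nn

torch.manual_seed(0)
K = 5                                        # number of classes (assumed)
X = torch.randn(1000, 2)                     # stand-in 2-D inputs
# Stand-in labels: class = angular wedge around the origin.
y = ((torch.atan2(X[:, 1], X[:, 0]) + torch.pi)
     / (2 * torch.pi) * K).long().clamp(max=K - 1)

linear = nn.Linear(2, K)                     # scores s = W x + b
opt = torch.optim.SGD(linear.parameters(), lr=0.1)
for epoch in range(500):
    loss = nn.functional.cross_entropy(linear(X), y)  # softmax + NLL
    opt.zero_grad()
    loss.backward()                          # backprop
    opt.step()                               # gradient descent
```

Stacking a small net in front of this linear head, so that it classifies a learned feature vector 𝗳(x) instead of the raw input, is what makes the classes linearly separable in the embedding space.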
Training of a 2 → 100 → 2 → 5 fully connected ReLU neural net via cross-entropy minimisation.
• it starts by outputting small embeddings
• around epoch 300, it learns an identity function
• it takes 1700 more epochs to unwind the data manifold
April 8, 2025 at 4:19 AM
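A sketch of the setup behind the animation (the toy data, learning rate, and snapshot schedule are assumptions; only the 2 → 100 → 2 → 5 architecture and the cross-entropy objective come from the post):

```python
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(1000, 2)                        # stand-in 2-D inputs
y = ((torch.atan2(X[:, 1], X[:, 0]) + torch.pi)
     / (2 * torch.pi) * 5).long().clamp(max=4)  # stand-in 5-class labels

model = nn.Sequential(
    nn.Linear(2, 100), nn.ReLU(),               # 2 → 100
    nn.Linear(100, 2),                          # 100 → 2: the visualised embedding
    nn.Linear(2, 5),                            # 2 → 5: linear classifier head
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

snapshots = []                                  # embeddings over time
for epoch in range(2000):
    loss = nn.functional.cross_entropy(model(X), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if epoch % 100 == 0:
        with torch.no_grad():
            snapshots.append(model[:3](X).clone())  # 2-D embedding snapshot
```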
Reposted by Alfredo Canziani
Did you enjoy Alfredo Canziani's lecture as much as we did?! If so, check out his website to learn more about his educational offerings: atcold.github.io
You can also find really cool material on Alfredo's YouTube channel! @alfcnz.bsky.social
March 11, 2025 at 5:36 PM
Reposted by Alfredo Canziani
📣 Just a few days before the start of Khipu 2025, we are pleased to announce that both the main-hall activities and Friday's closing ceremony will be live-streamed on this channel: khipu.ai/live/. We look forward to seeing you!
March 7, 2025 at 11:30 AM
I *really* had a blast giving this improvised lecture on a topic requested on the spot without any sleep! 🤪
The audience seemed to enjoy the show. 😄
To learn more about my educational offerings, check out my website! atcold.github.io
Follow here and subscribe on YouTube! 😀
➡️ The lecture ML & DL Fundamentals II with Alfredo Canziani @alfcnz is now starting at #KHIPU2025
Remember you can watch it live at khipu.ai/live
March 11, 2025 at 11:24 AM
In today's episode, we review the concepts of loss ℒ(𝘄, 𝒟), per-sample loss L(𝘄, x, y), and binary cross-entropy cost ℍ(y, ỹ) = y softplus(−s) + (1−y) softplus(s), with ỹ = σ(s) and s = 𝘄ᵀ𝗳(x).
Then, we minimised the loss by choosing convenient values for our weight vector 𝘄.
@nyucourant.bsky.social
February 11, 2025 at 9:14 PM
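A quick numerical check of that identity (a sketch; the scores s are random stand-ins): with ỹ = σ(s), −log ỹ = softplus(−s) and −log(1 − ỹ) = softplus(s), so the softplus form computes the same cost while staying finite for large |s|.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
s = torch.randn(5)                     # moderate scores: both forms agree
y = torch.randint(0, 2, (5,)).float()

naive = -(y * torch.log(torch.sigmoid(s))
          + (1 - y) * torch.log(1 - torch.sigmoid(s)))
stable = y * F.softplus(-s) + (1 - y) * F.softplus(s)
print(torch.allclose(naive, stable, atol=1e-6))   # True

# For large |s| the naive form hits log(0) = −inf, while softplus stays
# finite; F.binary_cross_entropy_with_logits(s, y) uses the stable form.
```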
Tue morning: *prepares slides*
Tue class: *improv blackboard lecture*
Outcome: unexpectedly great lecture.
Thu morning: *prep handwritten notes*
Thu class: *executes blackboard lecture*
Students: 🤩🤩🤩🤩🤩🤩🤩🤩🤩
@nyucourant.bsky.social @nyudatascience.bsky.social
January 30, 2025 at 8:20 PM
I think the new undergrad course is going well. At least we're having fun! 😁😁😁
Cosine (extract from undergrad DLSP25)
January 23, 2025 at 9:19 PM
Reposted by Alfredo Canziani
If you're interested in creative coding or simulations, come read The Nature of Code with us!
I'm setting up a chill book club to read the book and share projects starting in January. DM me for info!
December 20, 2024 at 6:47 PM
Reposted by Alfredo Canziani
I was reading about Conway's Game of Life, and this whole amazing book is available online!
The Nature of Code - by Daniel Shiffman
Thank you @shiffman.net for making it free for the web!
People, please consider buying a copy of the book :)
natureofcode.com/cellular-aut...
7. Cellular Automata
In Chapter 5, I defined a complex system as a network of elements with short-range relationships, operating in parallel, that exhibit emergent behavior…
December 10, 2024 at 11:27 PM
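For anyone who wants to play along before opening the book, a generic Game of Life step (a sketch, not Shiffman's code) fits in a few lines of NumPy: each cell counts its eight neighbours on a wrapped grid; a live cell survives with two or three, a dead cell is born with exactly three.

```python
import numpy as np

def step(grid):
    """One Game of Life generation on a torus (edges wrap around)."""
    n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))              # count the 8 neighbours
    return ((n == 3) | (grid & (n == 2))).astype(grid.dtype)

glider = np.zeros((8, 8), dtype=int)
glider[1, 2] = glider[2, 3] = 1
glider[3, 1] = glider[3, 2] = glider[3, 3] = 1
grid = glider
for _ in range(4):      # after 4 generations the glider has moved by (1, 1)
    grid = step(grid)
```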
It takes me weeks to detox from a stressful work mindset and finally sleep in — without waking at dawn, driven by the haste of unfinished tasks.
I struggle to turn on my laptop to do any work.
When I do restart, the anxiety comes back instantaneously. Is this common? ☹️☹️☹️
January 9, 2025 at 12:23 AM
Preparing an “Intro to Deep Learning” blackboard undergraduate course.
Number of currently enrolled students: zero. 🥹🥹🥹
Current motivation: very weak.
January 7, 2025 at 10:52 PM
Reposted by Alfredo Canziani
You’re still arguing about tabs vs. spaces? May I present…
December 25, 2024 at 6:37 PM
Reposted by Alfredo Canziani
Entropy is one of those formulas that many of us learn, swallow whole, and even use regularly without really understanding.
(E.g., where does that “log” come from? Are there other possible formulas?)
Yet there's an intuitive & almost inevitable way to arrive at this expression.
December 9, 2024 at 10:44 PM
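One way to see where the log comes from (a sketch of the standard argument, not the thread's full derivation): if the surprise of two independent events should add while their probabilities multiply, a logarithm is essentially forced, and entropy is just the expected surprise.

```python
import numpy as np

def H(p):
    """Shannon entropy H(p) = −Σᵢ pᵢ log pᵢ (assumes all pᵢ > 0)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

p = np.array([0.5, 0.25, 0.25])
q = np.array([0.9, 0.1])
joint = np.outer(p, q).ravel()              # independent joint distribution
print(np.isclose(H(joint), H(p) + H(q)))    # True: the log makes H additive
```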
Vancouver’s canelés are delicious! 😋
Get one at Granville Island public market!
December 16, 2024 at 12:54 AM
UltraPixel: Advancing Ultra-High-Resolution Image Synthesis to New Peaks
December 14, 2024 at 1:52 AM
Scalable optimisation in the modular norm. Straight to the point. 😀😀😀
December 14, 2024 at 1:46 AM