NIHR Global Health Research Professor.
www.theguardian.com/environment/...
1 month?
3 months?
6 months?
12 months?
Thanks for responding.
It’s some kind of magic, right?
This will 💥 your 🧠.
This fantastic team from the Netherlands, led by Sami Simons, is using voice diagnostics to help identify exacerbations of COPD.
All you have to do is say Aaaah!
Details 👇🏻
Thank you.
Exciting times ahead for our institutions and the communities we work with.
go.bsky.app/UAR96WC
Let me know if I've missed you out.
#COPD #Asthma #Emphysema #Breathless #Cough #Wheeze #FCTC #TobaccoControl #StarterPack
👋
X sent me a 7th-year anniversary GIF, and it seemed an apt time to join Bluesky
🌟 please say Hello 😊
Seeing old #PhD students back in the lab, catching up on how their work has led to new studies, and sharing their experience with our current team.
So good to see you back, Dr Aalrajeh - thank you for stopping by! And congratulations. 😉
This interesting article (‘ChatGPT is bullshit’!) considers the philosophy of lying and concludes that ChatGPT is not ‘hallucinating’; rather, it’s designed to generate text that looks like truth without regard for accuracy. In other words, ‘bullshit’.
link.springer.com/article/10.1...