Dan R Greening
@dan.greening.org
AI researcher, computer scientist, agile coach, rural activist, scuba diver, snowboarder, world traveler, serial entrepreneur, secular Buddhist, gay
Grateful to everyone, especially Matt Hildreth, for putting on this great @ruralorganizing.bsky.social event.
April 28, 2025 at 8:36 PM
3. Democrats must SHOW UP (with branding) in communities as contributors. This means having desks at rodeos, pitching in at community events, speaking at civic meetings, gaining competency, and running for office.
April 28, 2025 at 8:36 PM
2. Democrats must inspire hard-working, independent, and caring voters (in that order). Too often we lead with "caring," but our brand already owns that sector. If anything, voters see us as "too caring."
April 28, 2025 at 8:36 PM
1. Democrats lost the 2024 election (and face continuing headwinds) due to their failure to inspire young voters, especially young men (but young women, too), in rural counties.
April 28, 2025 at 8:36 PM
So next time someone suggests you examine some concept or build some new skill, let them know you’ll put your minds to it.
December 15, 2024 at 2:55 AM
I mentioned this to pal Errol Arkilic (UCI Chief Innovation Officer), and he said, “Have you read ‘Society of Mind,’ by Marvin Minsky? That book seems a lot like this ‘multi-headed attention’ notion in Transformers.”
December 15, 2024 at 2:55 AM
Different minds (with different queries) are called “heads” in the Transformer architecture.
December 15, 2024 at 2:55 AM
But in a Transformer architecture, it gets a bit more complicated. Different minds will explore different queries, like maybe What fun stuff can *I* do in “Africa”? We might then get a different set of Keys (Key:drinksbeer or Key:entertaininglecturer for “Jane”?)
December 15, 2024 at 2:55 AM
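As a rough sketch of that “several minds” picture, assuming the standard multi-head attention layout rather than anything stated in these posts: each head carries its own Query/Key/Value projection matrices, so each head asks its own question of the same sentence, and the heads’ answers are concatenated and mixed back together. All the embeddings and matrices below are random placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, heads, Wo):
    """Each head is its own (Wq, Wk, Wv) triple: a separate 'mind' with its own Query."""
    answers = []
    for Wq, Wk, Wv in heads:
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        weights = softmax(Q @ K.T / np.sqrt(K.shape[-1]), axis=-1)
        answers.append(weights @ V)                   # this head's answer for every word
    return np.concatenate(answers, axis=-1) @ Wo      # merge all heads' answers

rng = np.random.default_rng(1)
n_words, d_model, n_heads = 5, 16, 4
d_head = d_model // n_heads
X = rng.normal(size=(n_words, d_model))               # stand-in word embeddings
heads = [tuple(rng.normal(size=(d_model, d_head)) for _ in range(3))
         for _ in range(n_heads)]
Wo = rng.normal(size=(n_heads * d_head, d_model))     # output projection
print(multi_head_attention(X, heads, Wo).shape)       # (5, 16)
```

One head might end up asking something like “who is the subject here?” while another asks “what fun stuff is there to do?”, which is the different-minds-different-queries intuition.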
This creates a cloud of meanings around each word, informed by the other words in the sentence. If you’re like me, that will make you pause for a minute and ask, “Is that what I’m doing when I comprehend a sentence?”
December 15, 2024 at 2:55 AM
Through repeated thinking, a Transformer forms a Query for each word and answers that Query with Key/Value vectors from every other word in the sentence (and, actually, from that same word too).
December 15, 2024 at 2:55 AM
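For readers who want that step in code, here is a minimal sketch of single-head scaled dot-product self-attention, the standard formulation; the embeddings and projection matrices are random stand-ins, not values from any trained model. Each word’s embedding is projected into a Query, a Key, and a Value, each Query is scored against every Key in the sentence (including the word’s own), and the softmax-weighted sum of Values becomes that word’s refreshed representation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (n_words, d_model) embeddings for one sentence. Each word's Query is
    answered by the Keys/Values of every word, including itself.
    """
    Q = X @ Wq                                 # one Query per word
    K = X @ Wk                                 # one Key per word
    V = X @ Wv                                 # one Value per word
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # how well each Key answers each Query
    weights = softmax(scores, axis=-1)         # each row sums to 1: attention weights
    return weights @ V                         # weighted blend of Values per word

# Toy sentence of 5 words, e.g. ["Jane", "studies", "chimps", "in", "Africa"].
rng = np.random.default_rng(0)
n_words, d_model, d_head = 5, 16, 8
X = rng.normal(size=(n_words, d_model))              # stand-in embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)           # (5, 8): a new vector per word
```

The (i, j) entry of `weights` is how much word i leans on word j when refreshing its own meaning, which is the “cloud of meanings” effect described above.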
But if Jane is an infant in the context of Africa, Key:deepunderstanding would likely have a low value.
December 15, 2024 at 2:55 AM
There are other key-value pairs associated with this query about “Africa.” For example, deepunderstanding might be another Key. If Jane is a primatologist, based on surrounding words, then Key:deepunderstanding might have a high value.
December 15, 2024 at 2:55 AM
If the subject of the action in Africa is “Jane”, then Key:subject for “Jane” might have a high value related to “Africa” in this sentence (think: “Jane is the likely subject”).
December 15, 2024 at 2:55 AM
Other words fill in some of the “features” associated with that word’s Query. We call each feature a Key. For example, one key could be the subject doing something in Africa.
December 15, 2024 at 2:55 AM
Each word poses some questions, and possibly one main question. For example, (depending on other words around it) “Africa” might prompt the question What’s happening in Africa? Call that Q… the Query.
December 15, 2024 at 2:55 AM
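To put rough numbers on the Jane/Africa story from the posts above, here is a toy scoring step; the two-number vectors are invented for illustration, not learned from anything. A Query is just a vector, each word offers a Key vector, a dot product scores how well each Key answers the Query, and a softmax turns those scores into attention weights.

```python
import numpy as np

# Invented vectors: "Africa" asks its Query, and every word in
# "Jane, a primatologist, works in Africa" offers a Key that may answer it.
query_africa = np.array([1.0, 0.5])            # roughly: "what's happening in Africa?"

keys = {                                       # hand-picked stand-in Key vectors
    "Jane":          np.array([0.9, 0.8]),     # likely subject, deep understanding
    "primatologist": np.array([0.7, 0.9]),
    "works":         np.array([0.4, 0.1]),
    "in":            np.array([0.0, 0.0]),
    "Africa":        np.array([0.5, 0.3]),     # a word also attends to itself
}

scores = np.array([k @ query_africa for k in keys.values()])   # dot-product scores
weights = np.exp(scores) / np.exp(scores).sum()                # softmax -> attention
for word, w in zip(keys, weights):
    print(f"{word:>14}: {w:.2f}")
```

If Jane were an infant rather than a primatologist, her Key vector would point away from this Query, and her weight would shrink: that is the “low value” case mentioned a few posts up.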
I’m checking it out on a flight tomorrow to Maui…
December 3, 2024 at 1:33 AM
If there were a @mauidreamsdiveco.com Bluesky account, you could inform Bluesky users about events, etc., and pals could follow it. :) Point your social media folks at this post?
November 24, 2024 at 5:20 AM