Dave Siegel
@daveasiegel.bsky.social
Professor of Political Science and Public Policy at Duke University. Associate Editor (Formal Theory) of the AJPS. daveasiegel.com
Probably should've tagged this thread: #Polisky #PolScience
May 22, 2025 at 3:29 PM
This was very much a group effort. Most of my co-authors aren't here, but I'll tag the amazing @margaretfoster.bsky.social. We hope this tool can be widely useful, and are happy to answer any questions you might have! (7/7)
May 22, 2025 at 2:46 PM
Right now, one needs to convert the data to binary input to use IRT-M, but we're also working on extensions to other data types. The package includes a vignette that walks the user through IRT-M's use. (6/7)
May 22, 2025 at 2:46 PM
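For a concrete sense of that binary conversion, here is a minimal base-R sketch; the variable and level names are hypothetical, and the package vignette covers the authors' actual workflow:

## Hypothetical example: one categorical survey item, five respondents.
responses <- data.frame(trust_media = c("High", "Low", "Medium", "High", "Low"))

## One-hot encode into 0/1 indicator columns, one per response level.
binary_items <- model.matrix(~ trust_media - 1, data = responses)
head(binary_items)  # columns: trust_mediaHigh, trust_mediaLow, trust_mediaMedium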
Because you can run IRT-M on large datasets with complex loading constraints, it opens up theoretically driven measurement to many more contexts. The paper focuses on the importance of matching measurement to theory; in ongoing work we're making the full range of uses clearer. (5/7)
May 22, 2025 at 2:46 PM
It's also user-friendly, requiring little coding or package knowledge: just give it input data and a constraint matrix encoding the pre-specification, and it returns individual-level posterior distributions over the latent dimensions. No need to learn any package-specific syntax. (4/7)
May 22, 2025 at 2:46 PM
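To give a feel for that workflow, a purely illustrative sketch follows; the function irt_m() and its arguments are assumptions for exposition, not the package's documented interface (the vignette has the real syntax):

## Hypothetical interface sketch -- names and signatures are assumed,
## not taken from the IRT-M documentation:
## Y   <- as.matrix(binary_items)     # N x J matrix of 0/1 item responses
## M   <- constraint_matrix           # pre-specified item-to-dimension links
## fit <- irt_m(Y = Y, M = M, d = 4)  # d = number of latent dimensions
## theta_draws <- fit$theta           # individual-level posterior draws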
IRT-M trades away some of the generality in specification that packages such as brms or blavaan offer in exchange for substantially increased speed: it runs on data with 10k respondents and 200+ items with 4 latent dimensions in about 10 minutes, regardless of the complexity of the pre-specified connections. (3/7)
May 22, 2025 at 2:46 PM
IRT-M is a Bayesian implementation of an Item Response Theory model in a similar vein to other constrained approaches such as BCFA or BSEM: it allows the user to pre-specify connections between items/questions and theoretically coherent latent dimensions. (2/7)
May 22, 2025 at 2:46 PM
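For concreteness, the model family here is multidimensional two-parameter IRT; in LaTeX, and assuming a logistic link (the paper's exact link function may differ):

\Pr(y_{ij} = 1 \mid \theta_i) = \mathrm{logit}^{-1}\!\big(\lambda_j^\top \theta_i - b_j\big)

where \theta_i is respondent i's vector of latent dimensions, \lambda_j the loadings for item j, and b_j the item difficulty. The pre-specified connections act as sign and zero constraints on the entries of \lambda_j.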
🚩
December 27, 2024 at 2:47 PM
Great news!
November 18, 2024 at 6:26 PM
Thanks for this! I'd love to be added, if possible.
November 16, 2024 at 1:48 PM
Thanks for putting this together! Would love to be added.
November 13, 2024 at 11:05 AM
Forgot the figure!
September 13, 2024 at 4:24 PM
This is NSF-funded work, joint with Marco Morucci, Margaret Foster, Katie Webster, and So Jin Lee. None of whom are on Bluesky yet. (8/8)
September 13, 2024 at 3:03 PM
We think the framework has the potential to help with a lot of measurement problems. We’re working right now to extend the input data from dichotomous to multichotomous and continuous as well. It may also be of interest to #econsky. (7/8)
September 13, 2024 at 3:02 PM
There’s a lot more in the paper. For example, the figure shows how IRT-M can produce estimates of abstract concepts (here, respondents' sense of "threat", broken out by the media sources they trust) from data not designed to measure those concepts (the Eurobarometer survey). (6/8)
September 13, 2024 at 3:02 PM
The package does the rest. The latent dimensions it captures can be correlated, and IRT-M discovers any such correlation from the data. The supervised steps ensure that the measures remain consistent across time and space. (5/8)
September 13, 2024 at 3:02 PM
Next, researchers identify data sources and assess how the latent dimensions would show up in the data, which can depend on context. They then construct a constraint matrix that encodes dependencies between items in the data and the latent dimensions. (4/8)
September 13, 2024 at 3:02 PM
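A minimal base-R sketch of such a constraint matrix, with hypothetical items and dimensions, assuming a coding of 1 for a positive loading, -1 for a negative loading, and 0 for no loading (check the paper and vignette for the exact convention):

## Rows: latent dimensions; columns: items in the data (all names hypothetical).
M <- matrix(
  c( 1,  0, -1,   # "threat": loads + on item_worried, - on item_safe
     0,  1,  0),  # "trust":  loads + on item_confidence only
  nrow = 2, byrow = TRUE,
  dimnames = list(c("threat", "trust"),
                  c("item_worried", "item_confidence", "item_safe"))
)
M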
We call this framework IRT-M, and it makes it easier for researchers to construct, measure, and present subtle or abstract concepts in their data. It’s a semi-supervised method. First, researchers identify theoretically meaningful latent dimensions. (3/8)
September 13, 2024 at 3:01 PM
There’s an accompanying R package with a vignette as well: github.com/dasiegel/IRT-M (2/8)
September 13, 2024 at 3:01 PM
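Installing from the linked repository should work via the standard GitHub route (a sketch; the name passed to library() is assumed from the repo and worth checking against its README):

## Install the development version from GitHub.
install.packages("remotes")
remotes::install_github("dasiegel/IRT-M")
library(IRTM)  # package name assumed; confirm in the repo README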