Matthew Craig
matthewjacraig.bsky.social
Assistant Professor @Central Michigan University. Privacy; HMC; HCI; HRI; Emerging media and technology. Nature lover and dog parent. 🏳️‍🌈 www.matthewjacraig.com
This work was inspired by Dr. Sandra Petronio’s work in theorizing Communication Privacy Management theory. Sandra unfortunately passed away on April 20, 2024. Without her pivotal original work, this research would not have been possible. (9/9, thanks for reading the whole thread! 🧵)
September 9, 2025 at 9:45 PM
This study extends Communication Privacy Management (CPM) theory into the human-algorithm context, helping us understand how people negotiate privacy boundaries with machines.

👉 Read the full article here: doi.org/10.1016/j.ch... (8/9)
🎯 Big takeaway: Privacy fatigue doesn't always mean giving up. Sometimes it makes users more guarded—pushing back on the idea that social media surveillance is just "the cost of being online." (7/9)
🔹 …when users are highly aware AND highly fatigued in privacy management with social media algorithms, they lock down even more—granting fewer "co-ownership rights" of their private information to platforms via their algorithms. (6/9)
🔹 The more aware people are of social media algorithms, the less willing they are to grant platforms access to their private information via those algorithms.

And here's the twist… (5/9)
🔹 People who like their algorithms (positive affect) are more willing to share private information; those who dislike them (negative affect) pull back. (4/9)
In my new article in Computers in Human Behavior, I surveyed 1,305 Facebook, Instagram, and TikTok users to examine how algorithm awareness and privacy fatigue influence the way they share (or refrain from sharing) private information with these platforms.

Here’s what I found 🔍 : (3/9)
Half said they try to "ignore the privacy breakdown, scroll past it, and even express emotional responsiveness, like feeling defeated," a response we called "passive coping" (p. 231).

👀But what happens when people feel tired of protecting their privacy on social media? 🥱🔒 (2/9)
This work was heavily inspired by Dr. Sandra Petronio’s work in theorizing CPM theory. While completing this research, which was one part of several studies & analyses, Sandra unfortunately passed away on April 20, 2024. Without her pivotal original work, this research would not have been possible.
September 2, 2025 at 3:34 PM
By using CPM theory to investigate this privacy dilemma, this work provides insight into privacy management issues with social media platforms and their algorithms, shows how breakdowns lead to certain recalibration practices, and inspires future work investigating human-machine communication privacy management (HMCPM).
🎯 Big takeaway: These algorithmic intrusions don’t just annoy users; they shake their sense of privacy, control, and trust in social media platforms.
🔎 In response to these algorithmically driven privacy breakdowns, people shared that they may try to ignore the ad, reset their privacy settings, delete social media apps, or even resign themselves to the idea that surveillance is "the cost" of being on social media.
🔎 Users encounter a wide array of recommended content and ads on social media that seem eerily related to their personal searches, their conversations online and offline, medical or health-related information, and their activity across a variety of CMC (computer-mediated communication) platforms, despite having taken steps to restrict access to their information.