khush04preet.bsky.social
@khush04preet.bsky.social
24/7 police access may increase crime deterrence and response efficiency, but it also risks mass surveillance and privacy erosion. Without strict limits, long-term monitoring can undermine civil liberties.
January 19, 2026 at 4:14 AM
Facial recognition by police should be restricted or paused because it disproportionately misidentifies minority groups. Using biased technology risks discrimination and undermines public trust.
January 19, 2026 at 4:11 AM
Cities should be allowed to use high-tech surveillance cameras only with strict limits and oversight. While they can improve public safety, they also risk violating privacy if used without clear legal authority and transparency.
January 19, 2026 at 4:10 AM
Should government agencies be allowed to use large-scale surveillance tools to track everyday citizens without their knowledge in the name of security?
January 18, 2026 at 7:27 PM
Police should only use AI facial recognition in very limited situations. If privacy and bias risks are not controlled, the technology could do more harm than good, even if it helps identify suspects.
January 17, 2026 at 4:13 AM
Police should not use AI facial recognition on body cameras until privacy and bias issues are fully addressed. Without clear limits and transparency, the technology could harm civil liberties and public trust.
January 17, 2026 at 4:11 AM
Police should be cautious using AI facial recognition on body cameras because it can harm privacy and make biased mistakes. Without strong rules and oversight, it could violate civil liberties.
January 17, 2026 at 4:10 AM
Should police use AI facial recognition in body cameras if it risks privacy and bias against certain groups?
January 17, 2026 at 4:07 AM
U.S. tech companies should be partly responsible if they know their products are used for surveillance or human rights abuses. Stronger rules can help prevent misuse.
January 17, 2026 at 3:55 AM
U.S. tech companies should be held accountable to a limited extent, especially if they know their technology is used for mass surveillance. While they may not control how governments use their products, they still have a responsibility to act ethically. Clear laws and regulations can help reduce misuse.
January 17, 2026 at 3:54 AM
U.S. technology companies should have some responsibility if they know their products are used for mass surveillance or human rights abuses. However, technology can be used in many ways, and companies cannot fully control how governments use it. Stronger rules are needed to reduce misuse.
January 17, 2026 at 3:52 AM
Should U.S. technology companies be held morally and legally responsible for how their products are used by foreign governments — especially when those products help build extensive surveillance systems that can facilitate human rights abuses?
January 17, 2026 at 3:46 AM