NPR News reports that “Apple Will Scan U.S. iPhones For Images Of Child Sexual Abuse.” Basically, Apple announced a program that will scan photos on iPhones in the United States for images of child sexual abuse. If the program finds a potential match, it will send the photo for review by a human. Apple will also monitor text messages: sexually explicit images sent to or from children under 13 will be flagged and blurred, and the parents can be notified.
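Reporting on the announcement describes the scanning as hash-based: each photo is reduced to a compact fingerprint (Apple calls its version NeuralHash) and compared against a database of fingerprints of known abuse images, so near-identical copies still match. The toy sketch below, with made-up pixel data, a simplistic "average hash," and an assumed distance threshold, only illustrates that general idea; it is not Apple's actual algorithm.

```python
# Toy sketch of hash-based image matching (illustrative assumptions only;
# Apple's NeuralHash is a learned perceptual hash, not this average hash).

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(pixels, known_hashes, threshold=1):
    """Flag the image for human review if its hash is near any known hash."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# A "known" image and a slightly altered copy produce matching hashes,
# while an unrelated image does not.
known = [[10, 200], [220, 30]]
altered = [[12, 198], [221, 29]]
database = {average_hash(known)}
print(matches_database(altered, database))          # True: near-duplicate flagged
print(matches_database([[5, 5], [5, 5]], database))  # False: unrelated image
```

The key design point, and the reason critics worry about framing someone, is that the system never looks at the picture itself, only at whether its fingerprint lands close to a fingerprint already in the database.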
In theory, this sounds wonderful, but some people have concerns. They point out that it could be a breach of privacy. One person notes that a knowledgeable attacker could trick the system and frame someone. Others worry that Apple itself could abuse the program to search for things unrelated to child sexual abuse. Still, some are not as concerned; they feel the need to find abusers of children outweighs the potential problems. Apple maintains that the program does not compromise privacy or security.
Opinion on Apple Program
Sexually abusing a child is horrible and needs to stop. Still, this program makes me uneasy. We have gone from innocent until proven guilty to guilty until proven innocent. By scanning everyone, Apple effectively treats every user as a suspect and lets the program sort out innocence or guilt afterward. Searching everyone assumes guilt first. For this reason, I cannot agree with implementing this program. Despite Apple's assurances, it does breach privacy and security, and there is no way to tell how it will be used or modified in the future. It needs to stop before it starts.
Some may ask: then how else should we catch the people who sexually abuse children? I would say, by teaching. First, we need to teach teachers, daycare workers, and parents how to recognize the signs of abuse and of sexual grooming by adults. Second, we should set up a clear system for reporting these signs for investigation, and encourage the people working with children to report them as often as they see them. They should never be made to feel that reporting is a burden or the product of an overactive imagination. Third, we need to teach the children themselves: simple ways to defend themselves, and how to recognize an adult grooming them for abuse. Finally, we need to go back to communicating with each other. The more we interact, the more easily we can tell when a child needs help.
Though not perfect, I think these methods would do more to find abusers. Programs miss things all the time. It is only through real training, teaching, and interaction that we learn when something is wrong. What do you think? Should Apple go through with this program, or does it breach privacy and security too much? What is a better way to find child abusers?