According to Apple's website, Siri is the "intelligent personal assistant" that accompanies the iPhone, using voice recognition to, among other things, answer questions about nearby resources or basic information. For example, you can ask Siri where the closest sushi restaurant is, or how far it is to the nearest hospital.
However, a critique of Siri's limitations has gone viral, as many
are questioning the reasoning behind its faults. Apparently, Siri is not
capable of offering results that will direct a person to a birth control
clinic or the nearest abortion services, but it is perfectly capable of
addressing requests for Viagra or for Pregnancy Crisis Centers (which
are anti-abortion). In fact, in some instances, those who ask for an
abortion clinic are directed to Pregnancy Crisis Centers even when there
are nearby places that provide abortion services.
It is unclear whether Siri's limitations and flat-out mistakes are
purposeful or simply the result of poor programming. Or, as my esteemed
colleague Shelagh Johnson
hypothesizes, maybe terms like "birth control" and
"abortion" are so absent from general discourse that they are not
readily available in Siri's lexicon. Johnson is also asking great
questions, such as "Is Siri gay-friendly?" and how it reacts to questions
related to HIV testing. (Note: the link I provide for Shelagh Johnson
does not go to her critique of Siri, but to a series of interviews about
her work. Her inquiry into this issue has, for now, been limited in scope.)
For now, all we know is that Siri is not very helpful
for those seeking certain reproductive health services. Let's hope this
flaw is corrected in newer versions or with a patch. After all, I would not
be surprised if young people come to rely on Siri for answers to very
important questions with long-term impacts on their health.