by Craig Klugman, Ph.D.
In the near future:
Thank you Ms. Riviera, it seems that we have all of your paperwork in order for your new job. The only thing left is your microchip. Please extend your left hand. This will only sting a little.
Tagging humans with microchips has long been a trope in fiction: The X-Files; The Terminal Man; Total Recall; Johnny Mnemonic; South Park: Bigger, Longer & Uncut; Spider-Man 2; Mission: Impossible III; The Final Cut; and Strange Days, to name a few. In the real world, though, we microchip (yes, it has become a verb) our cats and dogs, not employees and grandpa. But at Epicenter, a high-tech office building in Sweden, people who work in the edifice can have a chip implanted. The implant contains a tiny RFID (radio-frequency identification) chip that lets the building recognize the employee. The chip opens doors and unlocks photocopiers, and may someday even let employees purchase food from the cafeteria. For now, the program is voluntary.
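The workflow described above, where a reader scans the implant's unique tag ID and the building decides what that person may do, amounts to a simple lookup against an access table. A minimal sketch follows; every tag ID, name, and resource in it is an invented example, not anything from Epicenter's actual system.

```python
# Hypothetical sketch of an RFID access-control lookup.
# All tag IDs, names, and resources below are invented for illustration.

ACCESS_TABLE = {
    "04A1B2C3": {"name": "Employee A", "doors": True, "copier": True, "cafeteria": False},
    "04D4E5F6": {"name": "Employee B", "doors": True, "copier": False, "cafeteria": False},
}

def check_access(tag_id: str, resource: str) -> bool:
    """Return True if the scanned tag is authorized for the named resource."""
    record = ACCESS_TABLE.get(tag_id)
    # Unknown tags and unlisted resources are both denied by default.
    return bool(record and record.get(resource, False))

print(check_access("04A1B2C3", "copier"))     # True
print(check_access("04D4E5F6", "cafeteria"))  # False
```

The deny-by-default lookup is the key design point: an unrecognized chip, or a chip without the relevant permission, simply does nothing, which is why such systems can be rolled out selectively and voluntarily.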
The era of implantable identification technology is upon us. One chip company extols the uses for its microchip, “…gentle implantation with minimal penetration force in most species at any life stage—from pocket pets to horses, from puppies and kittens to seniors.” Yes, you can chip your pet and your parent all with the same device.
In 1998, Kevin Warwick became the first “chipped” human. His smart office would recognize the chip and unlock the door, set the lighting to his preferred levels, turn on his favorite music, and prompt his computer to greet him.
The views, opinions and positions expressed by these authors and blogs are theirs and do not necessarily represent those of the Bioethics Research Library and Kennedy Institute of Ethics or Georgetown University.