
Artificial Intelligence Resources
http://www.airesources.info/

The Coming Technological Singularity
http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html


Machine and Man: Ethics and Robotics in the 21st Century
http://www.thetech.org/robotics/ethics/index.html

Wikipedia: Technological Singularity
http://en.wikipedia.org/wiki/Technological_singularity

Singularity
http://www.singularity.org/

Natural Born Robots "Robots Have Feelings, Too!"
http://www.pbs.org/safarchive/4_class/45_pguides/pguide_1002/44102_feelings.html#abou

Asimov's Three Laws of Robotics
http://en.wikipedia.org/wiki/Three_Laws_of_Robotics




Questions:

Should we create intelligent robots at all?

Is creating an intelligent robot an act that only God should perform?

Will there need to be regulation governing the creation of robots?

Will intelligent robots take away all forms of human employment?

If so, where are humans to derive their meaning and purpose in life?

If in the future machines have the ability to reason, be self-aware and
have feelings, then what makes a human being a human being, and a robot
a robot?

If you could have a robot that would do any task you like, a companion
to do all the work you prefer not to do, would you want one? If so, how
do you think this might affect you as a person?

Are there any kinds of robots that shouldn't be created, or that you
wouldn't want to see created? Why?

Automation and the development of new technologies like robots are
viewed by most people as inevitable, but many workers who lose their
jobs consider the practice unfair. Do you think the development and
implementation of new technologies are inevitable? What, if anything,
should we as a society do for the people who lose their jobs?