Another post I liked on Good Guys:
1. A robot may not harm the Microsoft Company, or, through inaction, allow the Microsoft Company to come to harm.
2. A robot may not harm a Microsoft Executive, or, through inaction, allow a Microsoft Executive to come to harm, except where doing so would conflict with the First Law.
3. A robot must obey the orders given to it by Microsoft Executives except where such orders would conflict with the First or Second Law.
4. A robot must obey the orders given to it by Microsoft Employees except where such orders would conflict with the First, Second, or Third Law.
5. A robot must obey the orders given to it by Microsoft Temp Workers except where such orders would conflict with the First, Second, Third, or Fourth Law.
6. A robot must protect its own existence, as long as such protection does not conflict with the First, Second, Third, Fourth, or Fifth Laws.
7. Harm may be defined as physical, fiscal, emotional, mental, or of any other type, as determined by a Microsoft Executive.