Hand job is a vulgar slang term for the act of stimulating a man's penis with the hand.
(noun) When a woman uses her hands to masturbate her husband to orgasm, this is an example of a hand job.