West is most commonly a noun, adjective, or adverb indicating direction or geography.
West is the direction towards which the sun sets at the equinoxes. It is one of the four cardinal points of the compass, on which it is the opposite of East and at right angles to North and South.
"The West" also often refers to Western countries. When used in this sense, it could mean anything from NATO, Europe and North America with or without Japan to whole Judeo-Christian civilisation. This meaning of the word merges with the concept of Western society.
See also: Western movie