WEST
noun
Definitions
- 1. The point in the heavens where the sun is seen to set at the equinox; or, the corresponding point on the earth; that one of the four cardinal points of the compass which is in a direction at right angles to that of north and south, and on the left hand of a person facing north; the point directly opposite to east. "And fresh from the west is the free wind's breath." Bryant.
- 2. A country, or region of country, which, with regard to some other country or region, is situated in the direction toward the west.
- 3. Specifically: The Western hemisphere, or the New World so called, it having been discovered by sailing westward from Europe; the Occident.
Other Definitions
This word also has 3 other definitions:
WEST
(adjective)
Lying toward the west; situated at the west, or in a western direction from the point of observation or reckoning; proceeding toward the west, or comi...
WEST
(adverb)
Westward.
WEST
(verb)
1. To pass to the west; to set, as the sun. [Obs.] "The hot sun gan to west." Chaucer.
2. To turn or move toward the west; to veer from the north or ...