Winter, as defined by the Merriam-Webster dictionary:
"The coldest season of the year that is after autumn and before spring"
Winter, as defined by a Floridian:
"That time of year that's slightly less warm than summer."
That's about as close as you'll ever get to building a snowman in Florida. If you hate snow, Florida is the place for you!