Everyone in the UK has been to Florida. And half of them have also been to LA.
For many of the people I spoke to, this is their only experience of America. Disneyland, palm trees, sandy beaches. This is not the America I know.
I have never been to Florida. Never been to Disneyland or Disney World. And I think I could go my whole life without ever going to those places because, as most of you already know, I hate people. And those places have a lot of people.
I don't mean you, dear readers. I mean I hate people I don't know. The people that cut in line, and talk loudly on their phones, and take up two parking spots just for the hell of it. The people that you'd probably find a lot of at theme parks.
And yet, I feel like I have to go to Florida at least once, even if we skip Disney World. Because I'm curious about what people see when they visit. If this is the only place in the US that the Brits visit, what is their impression of our country?
Obviously, it's a warmer and sunnier place than Ohio. But what else accounts for its allure? Maybe someday I'll make it to Florida, and I can ask all of the Brits staying in our hotel to explain it to me.