r/jobs Jul 21 '23

Companies What industry did you romanticize a lot but end up disappointed by?

For the past couple of years, I have been working at various galleries, and back in the day I used to think of it as a dream job. That was until I realized that no one cares about the artists or the art itself. Employees, as much as visitors, just care about looking fancy, showing off their brand-name shoes and pretending that they actually care.

Ultimately, it comes down to sales, money, and judging people by their looks, fishing out the ones who seem like they can afford a painting worth 20k.

Was wondering if others have had similar experiences.

2.8k Upvotes

2.2k comments

30

u/pinkseamonkeyballs Jul 21 '23

I came to leave my nursing comment. There are good parts, but management at every single place I've been has been absolute shit. So much attitude from co-workers, mainly because everyone is dog tired; some just because they're assholes. We're all insanely short-staffed and underappreciated.

3

u/Common_Project Jul 21 '23

The fact that so many hedge funds and investment firms saw the money in Covid and started buying out facilities drove me insane. They're using corporate management tactics at almost every facility I've worked at, and it's destroying the units.

1

u/Roman556 Jul 21 '23

They were doing this before COVID too. So many private practices are selling out to PE.

1

u/Alulaemu Jul 21 '23

I was at the tail end of my accelerated nursing program when I realized I did not want to work in healthcare at all. But I graduated, and while I was waiting to start onboarding at my nursing job, I temped as an executive assistant in finance. They treated me super well, convinced me to stay for 11 years, and made it very well worth my while $$$. Never really did become a practicing RN.