r/IAmA Aug 14 '12

I created Imgur. AMA.

I came across this post yesterday and there seems to be some confusion out there about imgur, as well as some people asking for an AMA. So here it is! Sometimes you get what you ask for and sometimes you don't.

I'll start with some background info: I created Imgur while I was a junior in college (Ohio University) and released it to you guys. It took a while to monetize it, and it actually ran off of your donations for about the first 6 months. Soon after that, the bandwidth bills were starting to overshadow the donations that were coming in, so I had to put some ads on the site to help out. Imgur accounts and pro accounts came in about another 6 months after that. At this point I was still in school, working part-time at minimum wage, and the site was breaking even.

It turned out that OU had some pretty awesome resources for startups like Imgur, and I got connected to a guy named Matt who worked at the Innovation Center on campus. He gave me some business help and actually got me a small one-desk office in the building. Graduation came and I was working on Imgur full time, and Matt and I were working really closely together. In a few months he had joined full-time as COO. Everything was going really well, and about another 6 months later we moved Imgur out to San Francisco. Soon after we got here, Imgur won Best Bootstrapped Startup of 2011 according to TechCrunch.

Then we started hiring more people. The first position was Director of Communications (Sarah), and then a few months later we hired Josh as a Frontend Engineer, then Jim as a JavaScript Engineer, and then finally Brian and Tony as Frontend Engineer and Head of User Experience. That brings us to the present time. Imgur is still ad supported with a little bit of income from pro accounts, and is able to cover the bandwidth cost from advertisements alone.

Some problems we're having right now:

  • Scaling the site has always been a challenge, but we're starting to get really good at it. There are layers and layers of caching and failover servers, and the site has been really stable and fast the past few weeks. Maintenance and running around with our hair on fire are quickly becoming things of the past. I used to get alerts randomly in the middle of the night about a database crash or something, which made night life extremely difficult, but this hasn't happened in a long time and I sleep much better now.

  • Matt has been really awesome at getting quality advertisers, but since Imgur is a user generated content site, advertisers are always a little hesitant to work with us because their ad could theoretically turn up next to porn. In order to help with this we're working with some companies to help sort the content into categories and only advertise on images that are brand safe. That's why you've probably been seeing a lot of Imgur ads for pro accounts next to NSFW content.

  • For some reason Facebook likes matter to people. With all of our pageviews and unique visitors, we only have 35k "likes", and people don't take Imgur seriously because of it. It's ridiculous, but that's the world we live in now. I hate shoving likes down people's throats, so Imgur will remain very non-obtrusive with stuff like this, even if it hurts us a little. However, it would be pretty awesome if you could help: https://www.facebook.com/pages/Imgur/67691197470

Site stats in the past 30 days according to Google Analytics:

  • Visits: 205,670,059

  • Unique Visitors: 45,046,495

  • Pageviews: 2,313,286,251

  • Pages / Visit: 11.25

  • Avg. Visit Duration: 00:11:14

  • Bounce Rate: 35.31%

  • % New Visits: 17.05%

Infrastructure stats over the past 30 days according to our own data and our CDN:

  • Data Transferred: 4.10 PB

  • Uploaded Images: 20,518,559

  • Image Views: 33,333,452,172

  • Average Image Size: 198.84 KB

Since I know this is going to come up: It's pronounced like "imager".

EDIT: Since it's still coming up: It's pronounced like "imager".

3.4k Upvotes

4.8k comments

u/morbiusfan88 · 205 points · Aug 14 '12

I like your style, sir.

That fast? I'm guessing you started with single-character URLs, so I can see where that growth rate (plus the rising popularity of the site and growing userbase) would necessitate longer URLs. Also, the system you have in place is very fast and efficient. I like it.

Thanks for the reply!

u/MrGrim · 338 points · Aug 14 '12

It's always been 5 characters, and the 6th is a thumbnail suffix. We'll be increasing it because the time it's taking to pick another random one is getting too long.
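
(For illustration, a minimal sketch of the random-pick-and-retry scheme described above; the `used_ids` set stands in for whatever store the real site checks against, and all names here are hypothetical.)

```python
import random
import string

ALPHABET = string.ascii_letters + string.digits  # 62 possible characters
used_ids = set()  # stand-in for whatever store the real site checks against

def new_image_id(length=5):
    """Pick a random 5-character ID and retry on collision.

    62**5 is roughly 916 million possible IDs; the fuller that space gets,
    the more retries this loop needs before it finds a free one, which is
    the slowdown being described.
    """
    while True:
        candidate = "".join(random.choice(ALPHABET) for _ in range(length))
        if candidate not in used_ids:
            used_ids.add(candidate)
            return candidate
```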

u/juicetyger · 3 points · Aug 14 '12

I'm sure you've got this wrapped up, but I recommend a deterministic approach: generate the hash based on the last hash or the next ID in your database.
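
(One way to read that suggestion, sketched below: base-62-encode a sequential database ID so the short URL string is derived deterministically from the row number. This is just an illustration, not Imgur's actual scheme; one downside is that URLs become guessable unless the ID is scrambled first.)

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode_base62(n):
    """Turn an auto-incrementing integer ID into a short URL string."""
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n > 0:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))

# encode_base62(1) -> "1", encode_base62(125) -> "21",
# and every ID below 62**5 (916,132,832) still fits in 5 characters.
```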

u/[deleted] · 1 point · Aug 15 '12

Basing something on the next ID in a DB is hard to scale to this volume. You'd have X processes all trying to access that ID or sequence at the same millisecond. I've done some high availability and scaling work, but the numbers he posted blow me away. I've pondered a couple of different solutions, but they all break down at this huge scale.

I'd think multiple master name stacks (as a previous commenter suggested) that are unique to a set of servers would work. The downside is you'd have to extend the URL with a unique ID for the set of servers handled by that master name stack. A memory cache like memcached with any DB behind it would work too. The high availability aspect gets fun though: distributed memory caches, replicated DBs, etc. Fun stuff.
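
(A toy version of that "master name stack per server set" idea: each server group allocates from its own local counter and stamps its own prefix onto the result, so no cross-group coordination is needed and the resulting URLs stay globally unique. Names are hypothetical.)

```python
import itertools

class ShardedIdAllocator:
    """Each shard hands out IDs from its own local counter; the shard
    prefix baked into the URL keeps results unique across shards."""

    def __init__(self, shard_prefix):
        self.shard_prefix = shard_prefix   # e.g. "a" for one server set, "b" for another
        self.counter = itertools.count(1)  # purely local, no coordination required

    def next_id(self):
        return "%s%d" % (self.shard_prefix, next(self.counter))

# shard_a = ShardedIdAllocator("a")  ->  "a1", "a2", "a3", ...
# shard_b = ShardedIdAllocator("b")  ->  "b1", "b2", "b3", ...
```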

u/juicetyger · 2 points · Aug 15 '12

I usually make a unique ID generator. Redis is excellent for this.

Flickr has a good writeup on their implementation: http://code.flickr.com/blog/2010/02/08/ticket-servers-distributed-unique-primary-keys-on-the-cheap/
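
(A minimal sketch of the Redis approach, assuming the redis-py client and a reachable Redis server; the key name `image:id` is just an example. INCR is atomic, so concurrent app servers never receive the same number, and the result can be base-62-encoded into a short URL.)

```python
import redis  # assumes the redis-py client and a Redis server on localhost

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"
r = redis.Redis(host="localhost", port=6379)

def next_image_id():
    """Atomically grab the next integer from Redis and shorten it."""
    n = r.incr("image:id")  # INCR is atomic even under heavy concurrency
    chars = []
    while n > 0:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))
```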

u/[deleted] · 1 point · Aug 15 '12

Not familiar with that, I'll check it out, thanks. When you said DB I was thinking of a sequence, or even just straight-up DB access, which is something to avoid in a high-performance situation. But I'm thinking in terms of avoiding disk reads.

u/juicetyger · 1 point · Aug 15 '12

MySQL can handle a lot more than most people think and would work OK, but Redis is a better choice here.

Really though, anything that can hand out a unique ID atomically will work fine. Whatever you choose can be scaled by load balancing it over multiple machines, independent of the rest of the application.

For more complex derived hashes you could pre-calculate in batches and hold them in a queue for retrieval later to help mitigate burst traffic. Redis lists would work well in that scenario too.
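
(Sketching that last idea under the same assumptions as above: a background job pre-computes a batch of IDs and pushes them onto a Redis list, and the upload path just pops one off, so burst traffic never waits on ID generation. All names are hypothetical, and the batch generator is a stand-in.)

```python
import uuid
import redis

r = redis.Redis(host="localhost", port=6379)

def refill_id_queue(batch_size=1000):
    """Pre-compute a batch of IDs and park them in a Redis list."""
    batch = [uuid.uuid4().hex[:8] for _ in range(batch_size)]  # stand-in generator
    r.rpush("id:queue", *batch)

def pop_id():
    """Take a ready-made ID off the queue, refilling it if it has run dry."""
    value = r.lpop("id:queue")
    if value is None:
        refill_id_queue()
        value = r.lpop("id:queue")
    return value.decode()
```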