How I scaled a website to 10 million users (web servers & databases, high load, and performance)
-
Frontend
->Load balancer
->Web server (your main web server)
-> Caching layer (usually on the web server itself) -> if the data is not in the cache, it then calls the database server
-
The frontend receives the locations of the content -> calls
Content delivery networks
for content
Apache issue
- The max number of concurrent connections is limited (around 1,000). Solution: optimize your queries; the database queries are probably too slow. Apache is not built for realtime services.
For images on a server
You may run out of disk space on the server. You can use symbolic links to a remote-mounted file system, such as one on your upload server. This gives you effectively infinite disk space, because you can keep mounting additional remote points that connect to additional file systems. A minimal symlink sketch follows below.
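A tiny sketch of the symlink idea in Go, assuming a remote file system (e.g. NFS) has already been mounted at a made-up path like /mnt/uploads2; the mount itself is set up at the OS level, and the app just exposes it under the web root:

```go
package main

import (
	"log"
	"os"
)

func main() {
	// Assume /mnt/uploads2 is a freshly mounted remote file system added
	// when the first disk filled up. Expose it under the images directory
	// via a symlink so the web server can keep serving from one tree.
	if err := os.Symlink("/mnt/uploads2", "/var/www/images/batch2"); err != nil {
		log.Fatal(err)
	}
}
```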
The primary concern for scaling is the database server
The database server is the component most likely to collapse.
- Make sure all database queries are optimized and indexed, i.e. every query uses keys and indexes.
- Keep queries simple. Where you can, do joins on the client rather than on the database server, to avoid putting extra load on the database (see the client-side join sketch after this list).
- If there are values that need to be computed, prefer to have them precomputed by a cron job and inserted into a table.
- If everything is already optimized, then just keep upgrading the hardware.
- The more RAM you have, the better the chance the whole database fits in RAM, so queries will be faster.
- Try master-slave database replication. The slaves are essentially copies of the master, with a little replication lag.
- The master is the source of truth, which you write to.
- A slave is a copy that you can read from (no writing).
- With one slave server, you can easily make more copies from it. Without a slave, you would have to pause the master, make a whole copy of it, and then bring everything back online again.
- This helps spread the load: rather than reading from the master, you read from the slaves.
- Often you don't need strictly consistent reads, so this is a good setup. The master is used only for writes, and for the occasional read that must be consistent (see the read/write split sketch after this list).
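As an example of doing the join on the client, here is a sketch using Go's database/sql. The `messages` and `users` tables, column names, and query shapes are all assumptions; the point is two simple indexed queries plus an in-memory join, instead of one server-side JOIN.

```go
package dbexample

import (
	"database/sql"
	"fmt"
	"strings"
)

type Message struct {
	ID     int
	UserID int
	Body   string
	Sender string // filled in by the client-side join
}

func loadMessagesWithSenders(db *sql.DB) ([]Message, error) {
	// Query 1: a simple, index-friendly scan of recent messages.
	rows, err := db.Query(`SELECT id, user_id, body FROM messages ORDER BY id DESC LIMIT 50`)
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	var msgs []Message
	userIDs := map[int]bool{}
	for rows.Next() {
		var m Message
		if err := rows.Scan(&m.ID, &m.UserID, &m.Body); err != nil {
			return nil, err
		}
		msgs = append(msgs, m)
		userIDs[m.UserID] = true
	}
	if len(msgs) == 0 {
		return msgs, nil
	}

	// Query 2: fetch only the users we actually need. Building the IN
	// list by string concatenation is safe here because the ids are
	// integers we formatted ourselves.
	ids := make([]string, 0, len(userIDs))
	for id := range userIDs {
		ids = append(ids, fmt.Sprint(id))
	}
	names := map[int]string{}
	rows, err = db.Query(`SELECT id, name FROM users WHERE id IN (` + strings.Join(ids, ",") + `)`)
	if err != nil {
		return nil, err
	}
	defer rows.Close()
	for rows.Next() {
		var id int
		var name string
		if err := rows.Scan(&id, &name); err != nil {
			return nil, err
		}
		names[id] = name
	}

	// The "join" happens here, in application memory, not on the DB server.
	for i := range msgs {
		msgs[i].Sender = names[msgs[i].UserID]
	}
	return msgs, nil
}
```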
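And a minimal sketch of the master-slave read/write split, again with database/sql. The table, column, and MySQL-style `?` placeholders are assumptions, and a real setup would round-robin reads across several slaves rather than using just one.

```go
package dbexample

import "database/sql"

type DB struct {
	master *sql.DB // all writes, plus reads that must be consistent
	slave  *sql.DB // replicated copy; reads may lag slightly behind
}

// Writes always go to the master, the source of truth.
func (d *DB) SaveProfile(userID int, bio string) error {
	_, err := d.master.Exec(`UPDATE users SET bio = ? WHERE id = ?`, bio, userID)
	return err
}

// Ordinary reads go to a slave, keeping load off the master.
func (d *DB) LoadProfile(userID int) (string, error) {
	var bio string
	err := d.slave.QueryRow(`SELECT bio FROM users WHERE id = ?`, userID).Scan(&bio)
	return bio, err
}

// Reads that must reflect the latest write (e.g. right after saving)
// are pointed at the master instead.
func (d *DB) LoadProfileConsistent(userID int) (string, error) {
	var bio string
	err := d.master.QueryRow(`SELECT bio FROM users WHERE id = ?`, userID).Scan(&bio)
	return bio, err
}
```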
Data sharding
- Take the user id, mod it by 10, and depending on the result, send the user to 1 of 10 different database servers (see the sketch below).
- Take tables that are not connected to each other, such as a chat table, a message table, and a user table, and give each its own server.
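A minimal sketch of the mod-10 scheme; the shard hostnames are made up:

```go
package main

import "fmt"

// Ten shard hosts; the user id alone decides which one holds a user's rows.
var shards = [10]string{
	"db0.internal", "db1.internal", "db2.internal", "db3.internal",
	"db4.internal", "db5.internal", "db6.internal", "db7.internal",
	"db8.internal", "db9.internal",
}

// shardFor maps a user id to one of the 10 shard hosts.
func shardFor(userID int) string {
	return shards[userID%10]
}

func main() {
	for _, id := range []int{7, 42, 1337} {
		fmt.Printf("user %d -> %s\n", id, shardFor(id))
	}
}
```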
For real time chat
For realtime chat you have to hold onto database connections longer. Use a language optimized for handling many concurrent connections at the same time (see the sketch after this list):
- Node.js
- Go
- Elixir
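To show why these languages fit, here is a tiny chat broadcaster in Go using only the standard library: one cheap goroutine per connection, so thousands of long-lived connections are no problem. The port, and the use of plain TCP lines rather than websockets, are simplifications.

```go
package main

import (
	"bufio"
	"log"
	"net"
)

var (
	join      = make(chan net.Conn)
	leave     = make(chan net.Conn)
	broadcast = make(chan string)
)

// broadcaster owns the set of connections; channels avoid shared locks.
func broadcaster() {
	conns := make(map[net.Conn]bool)
	for {
		select {
		case c := <-join:
			conns[c] = true
		case c := <-leave:
			delete(conns, c)
			c.Close()
		case msg := <-broadcast:
			for c := range conns {
				c.Write([]byte(msg + "\n"))
			}
		}
	}
}

// handle reads lines from one client and fans them out to everyone.
func handle(c net.Conn) {
	join <- c
	in := bufio.NewScanner(c)
	for in.Scan() {
		broadcast <- in.Text()
	}
	leave <- c
}

func main() {
	go broadcaster()
	ln, err := net.Listen("tcp", ":9000")
	if err != nil {
		log.Fatal(err)
	}
	for {
		c, err := ln.Accept()
		if err != nil {
			continue
		}
		go handle(c) // cheap goroutines make many concurrent clients easy
	}
}
```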
Load Balancer
- Nginx
- SoftLayer offers a load balancer as a service (a toy round-robin sketch follows below)
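To make the load balancer's job concrete, here is a toy round-robin reverse proxy built from Go's standard library. The backend addresses are made up, and in production you would use Nginx or a hosted balancer rather than this.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	backends := []string{"http://10.0.0.1:8080", "http://10.0.0.2:8080"}
	var proxies []*httputil.ReverseProxy
	for _, b := range backends {
		u, err := url.Parse(b)
		if err != nil {
			log.Fatal(err)
		}
		proxies = append(proxies, httputil.NewSingleHostReverseProxy(u))
	}

	var n uint64
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Round-robin: each request goes to the next backend in turn.
		i := atomic.AddUint64(&n, 1) % uint64(len(proxies))
		proxies[i].ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":80", nil))
}
```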
Caching layer
Memcached on the web server (see the cache-aside sketch below)
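A cache-aside sketch against a local Memcached, using the widely used github.com/bradfitz/gomemcache client (the key scheme, table, and TTL here are assumptions; any client with get/set works the same way):

```go
package cache

import (
	"database/sql"

	"github.com/bradfitz/gomemcache/memcache"
)

var mc = memcache.New("127.0.0.1:11211") // memcached runs on the web server

// getUserName checks the cache first and only hits the database on a miss.
func getUserName(db *sql.DB, userID string) (string, error) {
	key := "user:name:" + userID

	// 1. Try the cache.
	if it, err := mc.Get(key); err == nil {
		return string(it.Value), nil
	}

	// 2. Miss: fall through to the database server.
	var name string
	if err := db.QueryRow(`SELECT name FROM users WHERE id = ?`, userID).Scan(&name); err != nil {
		return "", err
	}

	// 3. Populate the cache so the next read skips the database (5 min TTL).
	mc.Set(&memcache.Item{Key: key, Value: []byte(name), Expiration: 300})
	return name, nil
}
```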
Content delivery network
-
Large static assets take up bandwidth on your own web server, so it's nice to have them on a separate machine, served via a content delivery network.
-
These services save copies of the large files on servers closer to the user.
Amazon Web Services and Google services
- I avoid them because they lock you in, and the costs are sometimes quite high.
NOTE:
- Never hard-code endpoints to the web server, chat server, or image server. They should be in variables (or config), so if you need to change an endpoint you can do so easily, which helps you quickly redistribute load if needed (see the sketch below).
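A minimal sketch of keeping endpoints in variables, here environment variables with made-up names and fallbacks:

```go
package main

import (
	"log"
	"os"
)

// endpoint reads a service address from the environment, with a fallback,
// so traffic can be repointed without a code change or redeploy.
func endpoint(name, fallback string) string {
	if v := os.Getenv(name); v != "" {
		return v
	}
	return fallback
}

func main() {
	webServer := endpoint("WEB_SERVER_URL", "http://web1.internal")
	chatServer := endpoint("CHAT_SERVER_URL", "http://chat1.internal")
	imageServer := endpoint("IMAGE_SERVER_URL", "http://img1.internal")
	log.Println(webServer, chatServer, imageServer)
}
```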