This is very difficult to guess at because the browsing needs of users vary widely, and so does the amount of data required to display a page. If one user is flicking through high-definition videos at three per minute, that single user might well saturate a WiFi link. On the other hand, if 200 users are filling in text forms, the load might not even be detectable.
Actual knowledge is always better than a guess. Take five typical users, sit them down at five test machines, and get them working; then periodically check the total data transferred and the peak bandwidth as reported by the internet firewall. Five users are needed so that the load can be reasonably averaged. Divide the peak load and peak bandwidth by five (the five test users) and add 50% as a safety margin: that is the requirement for one user. Then multiply that by 50 to see what the total required bandwidth is.
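The arithmetic above can be sketched in a few lines of Python. The peak figure below is a placeholder; substitute the peak actually reported by your own firewall during the pilot test, and adjust the target user count to match your situation:

```python
# Estimate total bandwidth from a five-user pilot test.
# peak_bandwidth_mbps is a hypothetical example figure -- use the
# value your internet firewall actually reports during the test.

TEST_USERS = 5
SAFETY_MARGIN = 0.5            # add 50% headroom per user
TARGET_USERS = 50

peak_bandwidth_mbps = 40.0     # example: peak observed with 5 test users

# Per-user requirement: peak divided by the test group, plus margin.
per_user_mbps = (peak_bandwidth_mbps / TEST_USERS) * (1 + SAFETY_MARGIN)

# Scale up to the expected population.
required_mbps = per_user_mbps * TARGET_USERS

print(f"Per-user estimate: {per_user_mbps:.1f} Mbit/s")
print(f"Required for {TARGET_USERS} users: {required_mbps:.1f} Mbit/s")
```

With the example numbers this gives 12 Mbit/s per user and 600 Mbit/s total; the point of the pilot is that your real numbers will differ.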
I would note that in this situation WiFi is not ideal. The WiFi spectra, both 2.4 GHz and 5 GHz, are cluttered with everybody-and-their-dog's networks, and in medium to large cities it is a fight to get even half of WiFi's claimed throughput because everyone is competing for the same spectrum. Continuous video streaming eats up WiFi bandwidth quickly, and with 50,000 people doing it all at once from 5 to 10 PM, you can guess what happens to the WiFi available for everyone else. There is much to be said for wired networking, not least that everyone on their own wire gets the full bandwidth of that wire (which is more than the full bandwidth of WiFi) 24 hours a day.
Jason Johanknecht
How many users per access point are you expecting? Most APs max out at around 25 concurrent users (the recommended limit), and the cheap units I see have trouble beyond 5 users.
How much area must you cover? Users connecting with a weak signal will drag down the overall results.
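Those two questions translate into a minimum AP count. A rough capacity-only sketch, assuming a hypothetical user count and the 25-users-per-AP ceiling mentioned above (coverage area may force you to deploy more APs than raw capacity requires):

```python
import math

# Hypothetical planning figures -- replace with your own expected
# user count and your AP vendor's recommended concurrent-user limit.
expected_users = 50
users_per_ap = 25   # typical recommended ceiling per access point

# Capacity floor: round up, since a fractional AP is not an option.
aps_needed = math.ceil(expected_users / users_per_ap)

print(f"Minimum APs for capacity alone: {aps_needed}")
```

Treat the result as a lower bound: weak-signal areas, walls, and channel overlap usually push the real number higher.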