SD - thread reading cheating?

When the hamsters are MIA, and a certain user (I won't use names) would like to read, say, several threads without waiting for EVER, can the certain user open a separate IE session (window) for each thread to be read?

Will this foul things up more if everyone does this?

Say four sessions are open at once. Does this slow down the server as if four more users were on? Or will it just take me… er, the user, four times as long to receive the info?

In short, is this a good or bad thing?

How does this affect the SD server and internet traffic in general if lots of people did this?

**When the hamsters are MIA, and a certain user (I won't use names) would like to read, say, several threads without waiting for EVER, can the certain user open a separate IE session (window) for each thread to be read?**

Yes, said user may do this. However, if the boards are extremely slow, and said user is a responsible member of the board, (s)he may choose to open as few threads as (s)he can.

**Will this foul things up more if everyone does this?**
If everyone did this, it would certainly increase the load on the server. However, you must understand that the load on the server is mostly due to searches performed using the board's search function, and much less by actually opening threads. So, the general suggestions are:

  1. Use http://www.boardreader.com to perform searches where possible

  2. Perform searches preferably at non-peak times and when the board is running smoothly. Keep the search criteria to the minimum (e.g. only yesterday, or only GQ, or only thread titles)

  3. Say Hi to Opal once in a while.

**Say four sessions are open at once. Does this slow down the server as if four more users were on? Or will it just take me… er, the user, four times as long to receive the info?**

It would certainly add to the load on the server. Yes, as far as the server is concerned, this is akin to four different users each opening one of the four threads. On your side, it will take you less time to open the four threads simultaneously rather than sequentially. But you will have noticed by now that if the board is slow, whether you open a single thread or multiple simultaneous threads, it still takes forever to load. When the board is fast, though, the simultaneous threads load really quickly. So my personal preference is to wait till the board is fast and then open multiple threads simultaneously, before it goes slow again.

**In short, is this a good or bad thing?**
It’s a bad thing if you overdo it, or do it all the time when the board is slow. A few threads opened simultaneously, I assume, would be normal Doper behaviour. It’s an extremely bad thing if you perform your searches when the board is slow.

**How does this affect the SD server and internet traffic in general if lots of people did this?**

Every thread you read adds to the load on the server, and to the servers across the Internet in general.

Try to understand it this way: The bandwidth that is available to the board is limited. Thus, it can only send a limited amount of data at a given time. The more people that want to access that data (open threads, post, search, etc.), the slower it gets for everybody. The more data each person requests (by opening more threads, etc.) the slower still that it gets.

And, you would further reduce the load on the server if you posted the question in the correct forum to begin with. A considerable amount of the limited resources is used to move a thread to the correct forum. This question should have been asked in the ATMB forum.

Hang on a minute xash.

If I am going to read 4 threads, then I am going to impose upon the server the workload involved in opening four threads. Whether I do that all at once or at four separate times does not alter the workload involved.

If I open all four now, that just means I don’t open three later.

With thousands of users all accessing the boards at once, the workload is going to average out, in the medium term.

I don’t see that it makes any difference.

Yes, the last time this came up (in About This Message Board, incidentally), there were people on both sides. Some believed that it would help, and some believed that it would hurt. It may be of note that TubaDiva was in the latter group. So at the very least, you’re safe if you avoid this measure. One of the posters came up with some sort of formulation, the details of which escaped me, having to do with the setup of the server, that would yield the optimum board-surfing strategy. However, I think that the specs necessary to come to a conclusion were never supplied.

Since this is a technical question about this message board, I’ll move this thread to ATMB.

I understand your point of view… but if you read my earlier post clearly, you might discern the difference… so let’s see if I can explain the difference more explicitly…

Ok…

Case 1: 4 threads. Sequentially, one after the other (assume about a minute to read each thread)

Case 2: 4 threads. Simultaneously.

Now, when the boards are fast, I’ll agree with you that either would be fine and affect the board within degrees of each other.

But, and here’s where the difference lies… when the boards are slow… Case 1 would be easier on the servers than Case 2. Assume all users follow Case 1 only, or all users follow Case 2 only.

Now, for the sake of understanding… let’s assume 100 users (including unregistered lurkers… yes, you there in the cute pink tank top, go register NOW :wink: ) on the board simultaneously.

Now, assume that all 100 users either follow Case 1 only, or all 100 follow case 2 only (this can be extended to the actual number of users at any given time, the concept is the same) so that we understand the effect of the different cases on the server, at a given time.

If all 100 users follow Case 2 and the boards are still fast, cool… we’re all happy.

But, let’s assume that when all 100 users follow Case 2 at a given time (i.e. at the same time), they are effectively opening 400 threads at the same time. Let’s assume that this causes a slowdown in the board response time, because we have reached the bandwidth limit at 400 thread calls. Now, if all 100 users continue to observe the Case 2 method of reading threads, the board remains slow. But here you argue that these 100 have already opened their 4 threads each, so it ain’t gonna get much slower.

Ok, so assume 100 more users join in. They enter when the boards are already slow (which hasn’t happened until now in our experiment). If they observe Case 2 browsing, the server will choke further. Agreed? Good. So now let’s assume that these same 100 (the new 100) came in when the boards were already slow, but chose to observe Case 1 browsing. That’s 300 fewer thread calls, 'coz 100 users chose to load only 1 thread each at the same time. So, the server is now loading 400 threads for the old 100 users and 100 more for the new 100 users.

Now let’s say they all settle down and start reading their respectively loaded threads… the load on the server reduces ('coz the 300 extra threads haven’t been called for when the board was already choking)… the new 100 users now load 1 more each. At this time they realize that the board is pretty fast (since the server has served up the previous 400+100 threads… and now only has to serve up 100 more). This is when the new 100 users switch to Case 2 and load their remaining threads simultaneously… and since the boards are fast (100+200 simultaneous requests… below the slow/choke level of 400 requests), the threads load fast. So they keep at Case 2, till such time that the board starts choking again. This leads to a repetition of the cycle.
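The cycle described above can be put into a toy calculation. All the numbers here (the 400-call choke level, 100 users, 4 threads each) are just the illustrative figures from the example, not real SDMB specs:

```python
# Toy model of the scenario above: the board "chokes" when simultaneous
# thread calls reach CHOKE_LEVEL. All numbers are illustrative, taken from
# the example in the post, not real server specifications.
CHOKE_LEVEL = 400

def requests_at_instant(n_users, threads_per_user):
    """Simultaneous thread calls if every user opens
    threads_per_user threads at the same moment."""
    return n_users * threads_per_user

# The old 100 users all follow Case 2 (4 threads at once): choke level hit.
old_load = requests_at_instant(100, 4)                # 400 calls

# 100 new users arrive while the board is already slow.
case2_load = old_load + requests_at_instant(100, 4)   # 800 calls: choking
case1_load = old_load + requests_at_instant(100, 1)   # 500 calls: 300 fewer

print(old_load, case2_load, case1_load)  # 400 800 500
```

The point of the sketch is only the difference between the two arrival strategies when the server is already at its limit: Case 1 arrivals add 100 calls where Case 2 arrivals would have added 400.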

What you have to understand here is the concept of requests at the same time, when the board is already slow. You have seen how the new 100 users, being all disciplined members, chose to use Case 1 to allow the server to remain at (or return to) pre-choke levels.

So, other than the numbers being way off, I hope I have made the concept a bit clearer. Yes, I understand that we cannot determine how many users are on at any given time and that this number keeps fluctuating… but what we can understand is that loading the server when it’s slow is not as good an idea as loading it when it is fast.

Just thought of this analogy. Highway. Cars. You have 4 each. Everybody takes all 4 at the same time. Chaos. When there’s a traffic jam you take out just 1 each. A bit more peaceful. As traffic eases, you bring out the rest of your cars. Dunno if that makes sense…

Ok… now you may flame me :slight_smile:

A simpler explanation of how multiple threads increase the load. Most of us are time constrained. We only have a set number of hours (or minutes) a day in which to read; the rest of our life keeps us busy the rest of the time. Say that time window is 2 hours. You have 2 hours in which to open any threads that you want to read, post to, etc. If you run 4 simultaneous thread requests at a time, then you can conceivably read 4 times the number of threads, because the loading is concurrent, so the effective wasted time for you is less. Assuming the board would be slow whether you were requesting or not (i.e. four requests at a time is a small addition), that is still four times the total number of thread requests for you. Whereas if you only open threads sequentially, then the waiting time comes out of the number of thread requests you can make, and you only make 1/4 the number of thread requests in your 2 hours. Your total load usage is reduced.
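The time-budget argument above can be put in numbers. The figures here (a 2-hour window, 5 minutes per slow page load, reading time ignored) are made up purely for illustration:

```python
# Hypothetical numbers for the time-budget argument: a reader has a fixed
# window, and each thread takes a fixed time to load when the board is
# slow. Reading time is ignored to keep the point stark.
window_minutes = 120        # the 2-hour reading window
load_minutes = 5            # time for one slow thread page to load
parallel_windows = 4        # Case 2: four requests in flight at once

# Case 1: wait for each thread to load before requesting the next.
sequential_threads = window_minutes // load_minutes               # 24

# Case 2: four loads overlap, so each 5-minute wait yields 4 threads.
concurrent_threads = (window_minutes // load_minutes) * parallel_windows  # 96

print(sequential_threads, concurrent_threads)  # 24 96
```

Same reader, same time window, four times the total requests made against the server, which is exactly the trade-off the post describes.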

Now obviously this explanation implies it is better for you to run multiple windows, which is intuitive to everyone. But the downside is the bandwidth usage, which affects everyone. So are you selfish or nice?

xash, your example sounds really contrived. 100 users all logging in at the same time? You could just as easily come up with a user distribution that would result in inopportune times for Case 1. As for your highway example, you have to contrast taking out four cars with taking out one car four times, not just once.

Irishman, your point about wasting more time with Case 1 is valid. But you could say that about anything that wastes time artificially. Forcing someone to sit through two minutes of a blank screen every time they posted would have the same effect, but I don’t think anyone argues that it’s a good idea.

I will use the first post I’ve been able to make in several days, due to the extreme slowness of the Board and my total inability to connect, to say that there is not an easy answer to this which can be derived with the information available. You would have to do an analysis of the usage and http logs to determine if one practice or another is the worst one for the Board.

I know of what I speak, for I have a low bandwidth Board, and I have to be sensitive to this.

I will add this - on my Board, as a result of long-term testing, I have determined that opening multiple windows at once appears to make the Board reading and serving faster. Since my bandwidth is so limited, it is very easy for me to test this. Of course, whether or not this is scalable depends on many factors. One serious advantage that I have is that, unlike the SDMB, I have my entire database, search indexes, and table caches entirely in memory. So that will obviously make a difference.

I guess what I’m saying is I feel that this question cannot be answered accurately by anyone except the Admin who has direct server access.

One more angle that wasn't covered.

User has 4 sessions open at once.
All four sessions have the same IP address (?).
Does the server alternate sending packets to session 1, then session 2, then session 3, and finally session 4, or can it send all packets simultaneously?
In other words, does the server possibly “see” the four open sessions as just one session because of the IP address?
Even though you are requesting more info at the same time, will you only receive it at the same overall speed regardless of the number of sessions?
Another way to ask: will each of the 4 sessions receive a quarter of a packet, whereas one session would receive the whole thing?
Is quantum mechanics easier to explain?
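On that last question, under the simplest possible assumption (the bottleneck is one fixed-size pipe to the user, shared fairly among open connections), four sessions each get roughly a quarter of the bandwidth, so the total data arrives in about the same time either way. A back-of-the-envelope sketch, with entirely made-up numbers (real TCP behaviour is messier than this):

```python
# Back-of-the-envelope model: one fixed pipe shared equally among open
# connections. All numbers are invented for illustration; real networks
# do not divide bandwidth this cleanly.
bandwidth_kbps = 40.0       # total bandwidth available to this user
thread_size_kb = 100.0      # size of one thread page
n_threads = 4

# One session at a time: full pipe per thread, four threads in a row.
sequential_time = n_threads * (thread_size_kb / bandwidth_kbps)

# Four sessions at once: each gets a quarter of the pipe; all finish together.
concurrent_time = thread_size_kb / (bandwidth_kbps / n_threads)

print(sequential_time, concurrent_time)  # 10.0 10.0 (seconds)
```

In this idealized model the user's total wait is identical; what parallelism changes is how many requests are sitting on the server at one instant, which is the whole Case 1 vs Case 2 debate.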

I don’t see anything unreal about assuming 100 users at a given time. In fact, I think it’s within the actual number of users on at peak times.

I, personally, don’t see any times when it is more favourable to the server that everybody uses Case 2. This is because, every user that observes Case 1 is limiting his/her pull on the server at a given time. So, whatever the number of users online, with everybody using Case 1, the load on the server is automatically limited because the maximum possible calls have been limited.

e.g.:

100 users Case 1 - A maximum possible of 100 calls at any given instant.

100 users Case 2 - A maximum possible of 400 calls at any given instant.

Therefore, for any distribution of users, at any given time, the limit of maximum possible calls for Case 1 will always be less than the maximum possible calls at that same instant, for the same number of users, if the users were observing Case 2 instead. It also follows that every user who chooses to observe Case 1 instead of Case 2 is reducing the maximum possible calls on the server at that instant.

I don’t have to contrast taking out 4 cars with taking out 1 car four times. I have to contrast taking out 4 cars at the same time to taking out those same 4 cars at different times. And this I have done through my analogy.

Ach, just to clarify, I’m not trying to thrust my opinion as the facts in relation to this board and its bandwidth. I’m just offering what seems to me an obvious conclusion with regards to differing board-browsing patterns.

I am, of course, open to seeing an example where Case 2 would be more beneficial to the board than Case 1, at any given time, for the same number of users. And Irishman has put it well that Case 2 is clearly beneficial to the individual user.

Yeah, but there’s a big difference between 100 users being logged on at the same time, and 100 users logging on at the same time. Do you see?

What about this? Within the last 20 minutes or so (about however long it takes to view four threads) 100 users have logged on at different times and are still logged on. If they all used Case 1, then they’re getting pages one at a time, and there are, say, 30 connections open. If they all used Case 2, then they’ve already gotten their pages, and 0 connections are open. User 101 comes by. Which is better for User 101? 30 open connections or 0 open connections?

Yes, but the fact that the Case 1 maximum possible value is lower does not imply that the Case 1 actual value is lower.

Okay, fair enough. In that case what I’m saying is that it’s not obvious.

Argument taken. Now it gets complicated :slight_smile:

Well, I really would like an answer to this, if it’s possible. I must admit, it surprised me to find out that Anthracite’s board does better under Case 2; I would have expected them to be about the same. But what do I know?

I would like to help out the board however I can (I even set my posts per page to 20! I’m so noble.) but in this case I don’t know what to do. So I usually use Case 2 because it’s more convenient for me, but when there’s any delay at all in loading threads, I only keep two or three connections open at a time, just in case it is worse.

Ok, since you wanna take it further… and since I wanna figure it out too…

Here’s a theory, combining all the scenarios mentioned…

Assumption:

When the board is really fast, Case 2 is the better option (both for the user and the server). This is because, since the board is fast, it’s not under much load… and using Case 2 at this point (let’s call it t=0) allows for all the bandwidth at t>0 to be used by anyone who comes along later (your 101st poster, for instance).

But, when the boards are slow… this indicates that there are already too many calls to the server… any additional load will just aggravate the situation… so in this case, at board-slow times, Case 1 would be better for the server… till such time that wide-spread Case 1 behaviour brings the server below board-slow levels, and everyone can go back to Case 2 behaviour.

Lather, rinse, repeat.

Comments?
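The combined theory above amounts to a simple adaptive rule: batch your requests when the board is fast, go one at a time when it is slow. A minimal sketch, where the response-time threshold and the window counts are invented for illustration:

```python
# Sketch of the adaptive rule proposed above: open several threads at once
# when the board is fast, drop to one at a time when it is slow. The
# 10-second threshold and the 4/1 window counts are invented numbers.
def windows_to_open(response_seconds, slow_threshold=10.0):
    """How many threads to request at once, given how long the
    last page took to load."""
    if response_seconds < slow_threshold:
        return 4   # Case 2: board is fast, batch the requests
    return 1       # Case 1: board is slow, go one at a time

print(windows_to_open(2.0), windows_to_open(30.0))  # 4 1
```

If enough users behaved this way, the system would oscillate around the choke level roughly as the post describes: widespread Case 1 behaviour during slow spells lets the load drain, after which everyone drifts back to Case 2.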

Preferred method for navigating this board? Here’s a way to surf the SDMB much faster.

xash I don’t accept your explanation at all. You rely on examples that assume everyone is coming onto and off the boards at the same time (which is entirely inaccurate), and you combine that with an apparent unwillingness to consider the fact that if you open four threads now, that’s four threads that you do not open later.

Say everyone is using the “four at once” strategy. And say the server is very busy. I come online and request four threads. You point out that this imposes a high load compared to the load if I just requested one thread. But what you leave out of the equation is that at any given moment, for everyone imposing a high load using my strategy, there are others who are imposing no load at all because they are reading the four threads that they opened earlier. In other words, this strategy results in people imposing four times the load, one quarter as often. And over multiple persons, that equation comes out to one.

Or to come at it from the other direction, say everyone is using the “one at once” strategy. And say the server is very busy. I come online and open one thread. You point out that this imposes a lower load compared to the load if I requested four threads. But what you leave out of the equation is that at any given moment, for everyone imposing a low load using your strategy, there are four times as many others who are also imposing a low load because they have run out of things to read because they did not open four threads earlier. In other words, this strategy results in people imposing one quarter the load four times as often. And over multiple persons, the equation comes out to one.
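The "it averages out" argument treats load as total requests over a long window, while the opposing argument treats it as peak requests at an instant. A quick sketch of both views, with hypothetical numbers:

```python
# The averaging argument: over a long window, total requests are the same
# whether each reader opens 4 threads at once or 1 thread four times.
# What differs is the worst-case peak at a single instant. Numbers are
# hypothetical.
readers = 100
threads_each = 4

# Total requests over the whole window: identical under either strategy.
total_requests = readers * threads_each               # 400 either way

# Worst-case simultaneous requests at one instant:
peak_one_at_a_time = readers * 1                      # 100
peak_four_at_once = readers * threads_each            # 400

print(total_requests, peak_one_at_a_time, peak_four_at_once)  # 400 100 400
```

Whether the peak or the average matters more depends on how the server degrades under burst load, which is exactly the question the thread never settles without server-side data.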

Another possible confounding factor: When the board is slow, it looks like the board waits for the resources to become available, and in the meantime just ignores the request (tech folks, is this accurate?). Now, obviously it’d be best if the server were serving out threads when it’s got the resources available, rather than when it doesn’t. But I, the user, don’t know when this will be. So I send five requests to the server at once, and let it decide when to take care of them. So what typically happens is I have five threads pending, and I’m sitting here playing Minesweeper waiting for them to open. Then all of a sudden, the server has some free time on its hands, and there’s some work ready for it, so it’s not sitting idle. Then, I read those threads while I’m accumulating some more, and I can afford to wait until the server has a free moment.

Chronos, are you muddling things up for those who don’t have the time to sweep mines? Are your pending requests slowing the board even more, to the point that other users get so frustrated that they open up more requests and the thing snowballs? Part of the OP was questioning the “ethics” of this, although I didn’t use that word.

If five requests are being sent, that’s no big deal (I would think) in and of itself. But if they’re all sent by one person, that might mean that four other people might not get responses to their own requests and are consequently frustrated. The five-request person might be happy when at least one request gets through, but those who only sent in one request might come away empty-handed.