
So, basically what I want to do is send multiple GET requests from the client to the web server all at once. My objective is to load test the server, so it is important that all the requests are sent simultaneously. I have read about the GIL, and it seems that true multithreaded parallelism is not possible in Python. Multiprocessing seems too heavyweight, and the client ends up crashing when testing a large number of users. Is there a way to achieve parallelism using the threading module itself? What other alternatives do I have?

import http.client
import del1  #a file to delete all the files from the destination folder each time the program runs
import urllib.request
import time
import os

t=float(input('enter time in seconds\t'))
ip=input('enter server ip address(default value:192.0.0.1)\t')
n=input('enter no of users (default value:1)\t')
file_name=input('Enter the file name (default value:testing2.txt)\t')
pr=input('enter protocol (default value:http)\t')
if ip=='':
    ip='192.0.0.1'
if file_name=='':
    file_name='testing2.txt'
if pr=='':
    pr='http'
if n=='':
    n=1
s_ip = ' '

for i in range(3,3+int(n)):
    s_ip='192.0.0.'+str(i)
    os.system('sudo ip address add '+s_ip+'/24 dev ens33')





link = pr+'://'+ip+'/'+file_name  #URL of the file to fetch, built from the inputs above
site = urllib.request.urlopen(link)
num = 0
print("file size:"+str(site.length))
t_end = time.time()+t
con = http.client.HTTPConnection(ip,source_address=(s_ip,80))
while time.time()<t_end:
    con.request("GET","/"+file_name)
    r1 = con.getresponse()
    data1 = r1.read()
    with open('/home/client1/test/try'+str(num)+'.txt','wb') as handle:
        handle.write(data1)
    num+=1

print("number of files downloaded is:"+str(num))
through = (num*(site.length)/t)
through=through/(1024*1024)
through=round(through,4)
print("Throughput = "+str(through)+"MBps")

This is my code until now. It works for a single user, but I don't know how to go about it for multiple users. Would using a different implementation of Python, like Jython, be useful?
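To make the goal concrete, here is a minimal sketch of how the single-user loop above could be fanned out across a thread pool with concurrent.futures. The constants and fetch_for_user are placeholder names, not part of the code above:

import concurrent.futures
import http.client
import time

NUM_USERS = 10           # placeholder: number of simulated users
DURATION = 5             # placeholder: seconds to keep requesting
SERVER_IP = '192.0.0.1'  # placeholder: server address
FILE_NAME = 'testing2.txt'

def fetch_for_user(user_id):
    # each simulated user keeps its own connection and requests the file in a loop
    con = http.client.HTTPConnection(SERVER_IP)
    count = 0
    t_end = time.time() + DURATION
    while time.time() < t_end:
        con.request("GET", "/" + FILE_NAME)
        resp = con.getresponse()
        resp.read()  # drain the response so the connection can be reused
        count += 1
    con.close()
    return count

with concurrent.futures.ThreadPoolExecutor(max_workers=NUM_USERS) as pool:
    counts = list(pool.map(fetch_for_user, range(NUM_USERS)))

print("files downloaded per user:", counts)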

2 Answers


Strictly speaking, you cannot send multiple requests at exactly the same time. Sending them at truly the same instant is very hard and depends on your hardware, OS and other conditions. What is easier is to send the requests at nearly the same time; whether that is good enough depends on how much spacing between requests you can accept. The code below shows a demo that sends 1000 requests within about 1 second. I would also advise you to try JMeter, LoadRunner or Gatling as alternatives.

import asyncio
import aiohttp
import datetime

async def fetch(session, url):
    # note when each request is issued, to see how tightly they are grouped
    start_time = datetime.datetime.now()
    print(start_time)
    async with session.get(url) as response:
        return await response.text()

async def main():
    base_url = "http://your_url:your_port"
    urls = [base_url for i in range(1000)]
    tasks = []
    async with aiohttp.ClientSession() as session:
        for url in urls:
            tasks.append(fetch(session, url))
        # fire all requests concurrently and wait for every response
        htmls = await asyncio.gather(*tasks)
        # for html in htmls:
        #     print(html[:100])

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())    
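On Python 3.7 and later, the last two lines can also be written simply as asyncio.run(main()).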

Furthermore, here is a question regarding the use of python3, asyncio and aiohttp.


1 Comment

See the rendezvous mechanism inside a traditional performance testing tool if you wish to synchronize all requests so they are submitted as close to simultaneously as possible. You have many tool choices in this case. But be aware that such models violate the client-server model for human users, which is predicated on the chaotic arrival and departure of client requests.
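For illustration, a rendezvous can be approximated in plain Python with threading.Barrier: every worker prepares its request, then blocks on the barrier so all GETs are released as close to simultaneously as the OS scheduler allows. The constants and the target URL below are assumptions, not taken from any particular tool:

import threading
import urllib.request

NUM_USERS = 50                          # assumed number of virtual users
URL = "http://192.0.0.1/testing2.txt"   # assumed target

barrier = threading.Barrier(NUM_USERS)

def user():
    barrier.wait()                      # rendezvous point: wait until every thread is ready
    with urllib.request.urlopen(URL) as resp:
        resp.read()

threads = [threading.Thread(target=user) for _ in range(NUM_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()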

Have you looked at Locust? It provides an easy-to-use Python syntax to describe user behaviour or test scenarios, and a UI to configure the load test, such as the peak number of concurrent users or the spawn rate.
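For example, a minimal locustfile for the scenario in the question might look like this (the class name and file path are assumptions):

from locust import HttpUser, task, between

class FileDownloadUser(HttpUser):
    wait_time = between(1, 2)   # seconds each simulated user pauses between tasks

    @task
    def download_file(self):
        self.client.get("/testing2.txt")

You would then run it with something like locust -f locustfile.py --host http://192.0.0.1 and set the number of users and spawn rate in the web UI.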

