Lab Tasks
This week we have looked at HTTP requests and responses, and how they are used to transfer data to and from a server.
In this week's lab we will examine requests in more detail, and practice making them, both through a standard browser and programmatically.
The Lab Machine
You will first need to download a copy of the tasks. They are on GitHub at 5067_Labs.
We will be using this repository for several of the tasks throughout the year.
dang@danglaptop ~/Github/Teaching$ git clone git@github.coventry.ac.uk:CUEH/5067_Labs.git
Cloning into '5067_Labs'...
remote: Enumerating objects: 64, done.
remote: Counting objects: 100% (64/64), done.
remote: Compressing objects: 100% (46/46), done.
remote: Total 64 (delta 12), reused 64 (delta 12), pack-reused 0
Receiving objects: 100% (64/64), 178.86 KiB | 12.78 MiB/s, done.
Resolving deltas: 100% (12/12), done.
Docker
You are also going to need Docker installed. On Kali / Parrot / Ubuntu / Debian you can use
$ apt install docker-compose
to get everything you need.
Running the Machine
Once you have cloned the repository you can run the machine.
Navigate to the appropriate directory. There should be a docker-compose file that sets everything up for you.
dang@danglaptop ~/Github/Teaching/5067_Labs/Tasks/Requests$ ls main
compose-deploy.yml docker-compose.yml Dockerfile README.md RequestsTrainer REQUIREMENTS.txt
dang@danglaptop ~/Github/Teaching/5067_Labs/Tasks/Requests$
We can then start the machine with
dang@danglaptop ~/Github/Teaching/5067_Labs/Tasks/Requests$ docker-compose up main
Creating network "requests_default" with the default driver
Building flask
Sending build context to Docker daemon 966.7kB
Step 1/8 : FROM cueh/flask
---> 4451e308a879
Step 2/8 : USER root
---> Using cache
---> 59c3f173b1d5
Step 3/8 : RUN apt-get update && apt-get install -y --no-install-recommends ncat
---> Using cache
---> 31092275508e
Step 4/8 : COPY REQUIREMENTS.txt /tmp/REQUIREMENTS.txt
---> Using cache
---> dd51ee72937a
Step 5/8 : RUN pip install -r /tmp/REQUIREMENTS.txt
---> Using cache
---> 88b7d4058cfb
Step 6/8 : USER flask
---> Using cache
---> d9901e067216
Step 7/8 : WORKDIR /opt
---> Using cache
---> 23f51fd6fc41
Step 8/8 : ADD ./RequestsTrainer /opt/RequestsTrainer/
---> e45c94614df2
Successfully built e45c94614df2
Successfully tagged requests_flask:latest
WARNING: Image for service flask was built because it did not already exist. To rebuild this image you must use `docker-compose build` or `docker-compose up --build`.
Creating requests_flask_1 ... done
Attaching to requests_flask_1
flask_1 | * Serving Flask app '/opt/RequestsTrainer' (lazy loading)
flask_1 | * Environment: development
flask_1 | * Debug mode: on
flask_1 | * Running on all addresses.
flask_1 | WARNING: This is a development server. Do not use it in a production deployment.
flask_1 | * Running on http://172.20.0.2:5000/ (Press CTRL+C to quit)
flask_1 | * Restarting with stat
flask_1 | * Debugger is active!
flask_1 | * Debugger PIN: 127-334-760
This uses the magic of Docker to create a container with Flask and all the other content we need to run the program, without you having to install anything (except Docker).
You can now access the machine at the IP address given.
flask_1 | * Running on http://172.20.0.2:5000/ (Press CTRL+C to quit)
Stopping the Machine
Once you are done, you can stop the docker container with the following:
1) Stop the process using ++ctrl+c++
2) Run docker-compose down
flask_1 | * Debugger PIN: 127-334-760
^CGracefully stopping... (press Ctrl+C again to force)
Stopping requests_flask_1 ... done
dang@danglaptop ~/Github/Teaching/5067_Labs/Tasks/Requests$ docker-compose down main
Removing requests_flask_1 ... done
Removing network requests_default
dang@danglaptop ~/Github/Teaching/5067_Labs/Tasks/Requests$
Tasks
Part 1: Request and Response Headers
The Requests Browser Trainer page in the trainer lets you see the HTTP headers that are sent as part of the request.
Different Browsers
Using a browser, take a look at the headers that the server receives.
You should get something like:
Method GET
Host 127.0.0.1:5000
User-Agent Mozilla/5.0 (X11; Linux x86_64; rv:87.0) Gecko/20100101 Firefox/87.0
Accept text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Language en-US,en;q=0.5
Accept-Encoding gzip, deflate
Dnt 1
Connection keep-alive
Referer http://127.0.0.1:5000/headers/viewHeaders
Cookie _pk_id.1.dc78=14ef10bea6a07780.1617533926.; session=eyJ1aWQiOjEsInVzZXJuYW1lIjoiZm9vQGJhci5uZXQifQ.YQ1YNA.yH6VGP_Lph4Aa8LPEXDCH7lDUHw
Upgrade-Insecure-Requests 1
Now take a look at what comes back using a different browser. What changes are there in the headers?
As the requests browser page returns an HTML page, it can be difficult to make out the contents. There is a version that returns a JSON-formatted version of the headers, which can make reading them easier; it can be found at (<server>/headers/requestJson).
You may also want to take a look at the page in the browser, to see how it copes with JSON.
Using Software
We can also use command line tools to make the request.
Following the instructions in Request Tools, examine the request headers using:
- Curl
- Netcat / Telnet
- Python's Requests module
If you are using the JSON formatted page, you should see something like
dang@danglaptop ~$ curl http://127.0.0.1:5000/headers/requestJson
{"method": "GET", "headers": {"Host": "127.0.0.1:5000", "User-Agent": "curl/7.76.0", "Accept": "*/*"}, "args": {}, "body": {}}
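The same request can be made from Python's Requests module. As a minimal sketch (assuming the trainer is at 127.0.0.1:5000, as in the transcript above), preparing the request through a Session lets you inspect the default headers Requests will add before anything is sent:

```python
import requests  # third-party: pip install requests

url = "http://127.0.0.1:5000/headers/requestJson"

# Preparing through a Session merges in the default headers Requests would send,
# without actually contacting the server.
session = requests.Session()
prepared = session.prepare_request(requests.Request("GET", url))
print(prepared.headers["User-Agent"])  # e.g. python-requests/2.31.0

# With the container up, actually send the request and decode the JSON reply:
# print(requests.get(url).json()["headers"])
```

Compare the default `python-requests/...` User-Agent with the `curl/...` one in the output above; the server sees exactly which tool made the request.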
We can also modify the request headers that get sent to the server.
This could allow us to change some functionality. For example, if a
site insists it will only work with IE7, modifying the user-agent
header might allow us access.
Changing the User-Agent
Our first "challenge" is to deal with one such case.
The page under challenges/setUA is behind a pay-wall.
However, the site developers will let Googlebot index the page.
- Change the User-Agent to Googlebot/2.1 to view the page.
You may also want to try different ways of doing this. While there is only one flag, can you change the User-Agent:
- In the browser
- Using a tool like Burp
- Programmatically with Python?
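For the Python route, here is a hedged sketch. It builds the request with a spoofed User-Agent without sending it, so you can confirm the header is set; the host and port are assumptions based on the Docker transcript earlier in the lab.

```python
import requests  # third-party: pip install requests

# Build (but don't yet send) a request with a spoofed User-Agent.
url = "http://127.0.0.1:5000/challenges/setUA"
spoofed = {"User-Agent": "Googlebot/2.1"}

prepared = requests.Request("GET", url, headers=spoofed).prepare()
print(prepared.headers["User-Agent"])  # Googlebot/2.1

# With the lab container running, send it and read the page:
# resp = requests.get(url, headers=spoofed)
# print(resp.text)
```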
Part 2: Sending Data to the server
For our second set of tasks we are going to look at sending data to the server.
On the GetRequests (headers/getRequest) and PostRequests (headers/postRequest) pages you can use the form to send data to the server, and see how it responds.
You can use this form to practice for the challenges, and build your payloads.
Easy Task
Play around with both GET and POST requests in the practice section; see how the data affects the request headers, and the data received.
- Confirm we can change the type of the <input> in the form, for example changing the type of the password field to text instead of password.
- Using the form, send a request with data.
    - For GET requests, see how the URL is updated with the data encapsulated in the query string.
    - For POST requests, see how the request data is placed in the request body.
- Modify the data sent by the hidden field and see how it affects the information received by the server, and the data sent in either the URL or request body.
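The same GET-vs-POST difference can be seen from Python without sending anything, by preparing the requests locally. The field names below (`user`, `token`) are made up for illustration, and the URLs assume the trainer at 127.0.0.1:5000:

```python
import requests  # third-party: pip install requests

payload = {"user": "test", "token": "abc"}

# GET: the data is encapsulated in the URL's query string
get_req = requests.Request(
    "GET", "http://127.0.0.1:5000/headers/getRequest", params=payload
).prepare()
print(get_req.url)  # ...?user=test&token=abc

# POST: the data is placed in the request body instead
post_req = requests.Request(
    "POST", "http://127.0.0.1:5000/headers/postRequest", data=payload
).prepare()
print(post_req.body)                      # user=test&token=abc
print(post_req.headers["Content-Type"])   # application/x-www-form-urlencoded
```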
Our second set of challenges is based around sending data to the server. Here we are going to see how we can change data types and add more elements to the request.
While there is only one flag for each challenge, feel free to experiment with different methods of sending the data, for example using the browser, Burp Suite and Python.
Data Challenge A
You can find this challenge under Challenges -> Form Data (/challenges/formData).
For our first data-based challenge we are going to modify the data sent by a form. You will also need to add an extra item to the data that is sent.
To get the first flag, send the following to the server. You will need to modify the form and/or the request.
- GET Request
- user = azreal
- date = -1
- secret = luther
For a bonus flag, send the same data, but this time using a POST request.
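In Python this is a one-liner with `params` (GET) or `data` (POST). The sketch below prepares the GET locally so you can see the query string; the send calls are commented out and assume the trainer at 127.0.0.1:5000:

```python
import requests  # third-party: pip install requests

base = "http://127.0.0.1:5000/challenges/formData"
payload = {"user": "azreal", "date": "-1", "secret": "luther"}

# Prepare (without sending) to see where the data ends up
get_req = requests.Request("GET", base, params=payload).prepare()
print(get_req.url)  # ...?user=azreal&date=-1&secret=luther

# With the container running, send it, then try the POST bonus:
# print(requests.get(base, params=payload).text)
# print(requests.post(base, data=payload).text)
```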
Data Challenge B
We won't always have a form to get us started; in this challenge we are going to have to make a request to a page manually. This is similar to what we would have to do to access a web API.
Send a POST request to the page at /challenges/APIdata/
The Request should have the following parameters
- POST Request
- user = abbadon
- role = despoiler
NOTE: If you ask, the server will respond with JSON data. This can help with parsing the output.
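A hedged sketch of the request, with the body prepared locally so you can check it before sending. Using an `Accept: application/json` header to "ask" for JSON is an assumption here; check the challenge page for how the trainer actually expects to be asked.

```python
import requests  # third-party: pip install requests

base = "http://127.0.0.1:5000/challenges/APIdata/"
payload = {"user": "abbadon", "role": "despoiler"}
headers = {"Accept": "application/json"}  # assumption: ask for JSON this way

req = requests.Request("POST", base, data=payload, headers=headers).prepare()
print(req.body)  # user=abbadon&role=despoiler

# With the container running:
# resp = requests.post(base, data=payload, headers=headers)
# print(resp.json())
```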
Part 3: Looking Closely at Responses
Sometimes looking closely at the response data can give us a clue to the site's behaviour when unexpected things happen.
Try viewing the page at /challenges/Response
(challenges -> Response)
Take a close look at the data returned by the response, and see if you can find the flag.
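The browser's rendered page is only part of the response. As a sketch of where else to look (the URL assumes the trainer at 127.0.0.1:5000, and the function needs the container up to actually run):

```python
import requests  # third-party: pip install requests

def inspect(url="http://127.0.0.1:5000/challenges/Response"):
    """Print the parts of a response that are easy to miss in a browser."""
    resp = requests.get(url, allow_redirects=False)
    print(resp.status_code)         # unusual status codes can be a clue
    print(dict(resp.headers))       # custom response headers sometimes carry data
    print(resp.cookies.get_dict())  # so can cookies
    print(resp.text[:500])          # and comments hidden in the HTML body
```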
Part 4: Automated Requests
For our final Challenge this week we are going to look at automating requests.
This can be super useful in the recon stage, as we can write things like fuzzers to help brute-force passwords. It's also useful as it allows us to write a PoC to automate our exploit process.
For the tasks, you might want to take a look at Python Requests to get the data. Either Requests-HTML or Beautiful Soup can be good for parsing the HTML returned.
Automation 1
Our first automation task can be found at challenges/automate
(Challenges -> Automate)
Find a way of sending the hash back to the server within the time limit.
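The general shape of the loop is: fetch the page, extract the hash, and POST it straight back on the same session so you stay inside the time limit. This is a hedged sketch; the form field name (`"hash"`) and the hash-spotting regex are assumptions, so inspect the challenge page for the real details.

```python
import re
import requests  # third-party: pip install requests

def extract_hash(html):
    """Pull the first token that looks like a hex hash (MD5..SHA-256) out of the page."""
    match = re.search(r"\b[0-9a-f]{32,64}\b", html)
    return match.group(0) if match else None

def solve(base="http://127.0.0.1:5000"):
    # A Session reuses the same cookies, so the server can tie the answer
    # to the hash it just issued.
    with requests.Session() as s:
        page = s.get(f"{base}/challenges/automate")
        token = extract_hash(page.text)
        resp = s.post(f"{base}/challenges/automate", data={"hash": token})
        print(resp.text)
```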
For the second task you are going to have to go through several stages, and parse some reasonably complex data. We will also need to deal with some "anti-hacker" protection in the form submissions.
If you are not used to it, the code is going to seem quite complex, but persevere with it and build on what we did in the previous challenge; it's great practice for automating some of the complex stuff later.
Automation 2
Our second automation task can be found at /challenge/automateQuiz
- Defeat the CSRF protection to submit the first form
- Answer the questions in the second form to get a flag.
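A common CSRF scheme hides a one-time token in the form, which you must read out and send back with your submission. The sketch below shows that first step with Beautiful Soup; the field name `"csrf_token"` and the idea that the token lives in a hidden input are assumptions, so view the page source to confirm them.

```python
import requests                 # third-party: pip install requests
from bs4 import BeautifulSoup   # third-party: pip install beautifulsoup4

def extract_csrf(html, field="csrf_token"):
    """Return the value of a hidden form input, or None if it isn't there."""
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("input", {"name": field})
    return tag["value"] if tag else None

def submit_first_form(base="http://127.0.0.1:5000"):
    # The token is usually tied to your session cookie, so fetch and submit
    # on the same Session.
    with requests.Session() as s:
        page = s.get(f"{base}/challenge/automateQuiz")
        token = extract_csrf(page.text)
        return s.post(f"{base}/challenge/automateQuiz", data={"csrf_token": token})
```

From there, parse the second form the same way to pull out the quiz questions and build your answers.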