
School of Artificial Intelligence and Advanced Computing
Xi’an Jiaotong-Liverpool University
DTS101TC Introduction to Neural Networks
Coursework
Due: Sunday, April 21st, 2024 @ 17:00
Weight: 100%
Overview
This coursework is the sole assessment for DTS101TC and aims to evaluate your comprehension of the module. It consists of three sections: 'Short Answer Question', 'Image
Classification Programming', and 'Real-world Application Question'. Each question must be
answered as per the instructions provided in the assignment paper. The programming task
necessitates the use of Python with PyTorch within a Jupyter Notebook environment, with all
output cells saved alongside the code.
Learning Outcomes
A. Develop an understanding of neural networks – their architectures, applications and
limitations.
B. Demonstrate the ability to implement neural networks with a programming language.
C. Demonstrate the ability to provide critical analysis of real-world problems and design
suitable solutions based on neural networks.
Policy
Please save your assignment in a PDF document, and package your code as a ZIP file. If there
are any errors in the program, include debugging information. Submit both the answer sheet
and the code ZIP file via Learning Mall Core to the appropriate drop box. Electronic submission
is the only accepted method; hard copies will not be accepted.
You must download your file and check that it is viewable after submission. Documents may
become corrupted during the uploading process (e.g. due to slow internet connections).
However, students themselves are responsible for submitting a functional and correct file for
assessments.
Avoid Plagiarism
• Do NOT submit work from others.
• Do NOT share code/work with others.
• Do NOT copy and paste directly from sources without proper attribution.
• Do NOT use paid services to complete assignments for you.
Q1. Short Answer Questions [40 marks]
The questions test general knowledge and understanding of central concepts in the course. The answers
should be short. Any calculations need to be presented.
1. (a.) Explain the concept of linear separability. [2 marks]
(b.) Consider the following data points from two categories: [3 marks]
X1 : (1, 1) (2, 2) (2, 0);
X2 : (0, 0) (1, 0) (0, 1).
Are they linearly separable? Make a sketch and explain your answer.
2. Derive the gradient descent update rule for a target function represented as
o_d = w_0 + w_1x_1 + ... + w_nx_n.
First define the squared error function, considering a provided set of training examples D, where each
training example d ∈ D is associated with the target output t_d. [5 marks]
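For reference, a worked sketch using the usual squared-error definition and batch gradient descent (learning rate η; the convention x_{0,d} = 1 is assumed so that w_0 is handled uniformly):
\[
E(\mathbf{w}) = \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2, \qquad
\frac{\partial E}{\partial w_i} = -\sum_{d \in D} (t_d - o_d)\, x_{i,d}, \qquad
w_i \leftarrow w_i + \eta \sum_{d \in D} (t_d - o_d)\, x_{i,d}.
\]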
3. (a.) Draw a carefully labeled diagram of a 3-layer perceptron with 2 input nodes, 3 hidden nodes, 1
output node and bias nodes. [5 marks]
(b.) Assuming that the activation functions are simple thresholds, f(y) = sign(y), write down the input-output functional form of the overall network in terms of the input-to-hidden weights w_ab and the hidden-to-output weights w̃_bc. [5 marks]
(c.) How many distinct weights need to be trained in this network? [2 marks]
(d.) Show that it is not possible to train this network with backpropagation. Explain what modification
is necessary to allow backpropagation to work. [3 marks]
(e.) After you have modified the activation function, use the chain rule to calculate expressions for the following derivatives:
(i.) ∂J/∂y, (ii.) ∂J/∂w̃_bc,
where J is the squared error and t is the target. [5 marks]
4. (a.) Sketch a simple recurrent network, with input x, output y, and recurrent state h. Give the update
equations for a simple RNN unit in terms of x, y, and h. Assume it uses tanh activation. [5 marks]
(b.) Name one example that can be more naturally modeled with RNNs than with feedforward neural
networks. For a dataset X := {(x_t, y_t)}_{t=1}^{k}, show how information is propagated by drawing a
feedforward neural network that corresponds to the RNN from the figure you sketched, for k = 3. Recall
that a feedforward neural network does not contain nodes with a persistent state. [5 marks]
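For reference, the standard formulation of a simple (Elman-style) RNN update with tanh activation, including bias terms as a common convention, is
\[
h_t = \tanh(W_{xh} x_t + W_{hh} h_{t-1} + b_h), \qquad
y_t = W_{hy} h_t + b_y,
\]
and unrolling these equations for t = 1, ..., k yields the corresponding feedforward network, with one copy of the unit per time step and shared weights.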
Q2. Image Classification Programming [40 marks]
For this question, you will build your own image dataset and implement a neural network with PyTorch. The
question is split into a number of steps, and each step carries marks. Answer the questions for each step
and include screenshots of the code outputs in your answer sheet.
- Language and Platform Python (version 3.5 or above) with PyTorch (latest version). You may use
any libraries available on the Python platform, such as NumPy, SciPy, Matplotlib, etc. You must run the code
in a Jupyter notebook.
- Code Submission Your dataset and all code (Python files and .ipynb files) should be packaged in a single
ZIP file, together with a PDF of your Jupyter notebook including its output cells. INCLUDE your dataset in the ZIP
file.
1. Dataset Build [10 marks]
Create an image dataset for classification with 120 images (‘.jpg’ format), featuring at least two categories. Resize or crop the images to a uniform size of 128 × 128 pixels. Briefly describe the dataset you
constructed.
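One possible workflow is sketched below; the folder names raw and dataset are illustrative assumptions (any layout with one sub-folder per category works).
# A minimal sketch: resize every raw photo to 128 x 128 and save it as .jpg,
# keeping one sub-folder per category. The paths 'raw' and 'dataset' are
# illustrative assumptions.
from pathlib import Path
from PIL import Image

RAW_DIR, OUT_DIR, SIZE = Path("raw"), Path("dataset"), (128, 128)

for img_path in RAW_DIR.glob("*/*.jpg"):
    out_path = OUT_DIR / img_path.parent.name / img_path.name
    out_path.parent.mkdir(parents=True, exist_ok=True)
    Image.open(img_path).convert("RGB").resize(SIZE).save(out_path, "JPEG")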
2. Data Loading [10 marks]
Load your dataset and randomly split it into a training set (80 images), a validation set (20 images) and
a test set (20 images).
For the training set, use Python commands to display the number of data entries, the number of classes,
the number of data entries for each class, and the shape of the images. Randomly select 10 images from the
training set and plot them with their corresponding labels.
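A minimal sketch, assuming the dataset built in step 1 is stored as dataset/<class_name>/*.jpg:
# A minimal data-loading sketch using torchvision's ImageFolder and random_split.
import torch
from torch.utils.data import random_split
from torchvision import datasets, transforms
from collections import Counter
import matplotlib.pyplot as plt

transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])
full_set = datasets.ImageFolder("dataset", transform=transform)

train_set, val_set, test_set = random_split(
    full_set, [80, 20, 20], generator=torch.Generator().manual_seed(0))

train_labels = [full_set.targets[i] for i in train_set.indices]
print("training entries:", len(train_set))
print("number of classes:", len(full_set.classes))
print("entries per class:", Counter(train_labels))
print("image shape:", train_set[0][0].shape)   # e.g. torch.Size([3, 128, 128])

# randomly plot 10 training images with their labels
idx = torch.randperm(len(train_set))[:10]
fig, axes = plt.subplots(2, 5, figsize=(12, 5))
for ax, i in zip(axes.flat, idx.tolist()):
    img, label = train_set[i]
    ax.imshow(img.permute(1, 2, 0).numpy())
    ax.set_title(full_set.classes[label])
    ax.axis("off")
plt.show()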
3. Convolutional Network Model Build [5 marks]
# pytorch.network
class Network(nn.Module):
    def __init__(self, num_classes=?):
        super(Network, self).__init__()
        self.conv1 = nn.Conv2d(in_channels=3, out_channels=5, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(in_channels=5, out_channels=10, kernel_size=3, padding=1)
        self.fc1 = nn.Linear(10 * 32 * 32, 100)
        self.fc2 = nn.Linear(100, num_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, 10 * 32 * 32)
        x = self.fc1(x)
        x = self.fc2(x)
        return x
Implement the Network above and complete the table below according to the provided Network. Use the symbol
‘-’ for cells that do not require completion. What are the differences between this model and
AlexNet?
Layer   | # Filters | Kernel Size | Stride | Padding | Size of Feature Map | Activation Function
Input   |           |             |        |         |                     |
Conv1   |           |             |        |         |                     | ReLU
MaxPool |           |             |        |         |                     |
Conv2   |           |             |        |         |                     | ReLU
FC1     | -         | -           | -      |         |                     | ReLU
FC2     | -         | -           | -      |         |                     |
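The feature map sizes in the table can be checked by tracing a dummy input through the layers; the following is a minimal sketch, assuming 128 × 128 RGB inputs (the layer settings are copied from the provided Network).
# A minimal shape-check sketch: trace a dummy 128 x 128 RGB image through
# the layers of the provided Network (instantiated individually here so that
# the num_classes placeholder does not matter).
import torch
import torch.nn as nn
import torch.nn.functional as F

conv1 = nn.Conv2d(in_channels=3, out_channels=5, kernel_size=3, padding=1)
pool = nn.MaxPool2d(2, 2)
conv2 = nn.Conv2d(in_channels=5, out_channels=10, kernel_size=3, padding=1)

x = torch.randn(1, 3, 128, 128)
x = pool(F.relu(conv1(x)))
print(x.shape)                 # torch.Size([1, 5, 64, 64])
x = pool(F.relu(conv2(x)))
print(x.shape)                 # torch.Size([1, 10, 32, 32])
x = x.view(-1, 10 * 32 * 32)
print(x.shape)                 # torch.Size([1, 10240]), matching nn.Linear(10 * 32 * 32, 100)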
4. Training [10 marks]
Train the above Network for at least 50 epochs. Explain what the loss function is, which optimizer you
use, and the other training parameters, e.g., learning rate, number of epochs, etc. Plot the training history, e.g.,
produce two graphs (one for training and validation losses, one for training and validation accuracy),
each containing two curves. Has the model converged?
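A minimal training-loop sketch is given below; it assumes the train_set and val_set from the data-loading step, a two-class dataset (so the num_classes placeholder is filled with 2), cross-entropy loss, and SGD with momentum. All of these settings are illustrative choices, not required ones.
# A minimal training-loop sketch; the loss function, optimizer, learning rate,
# batch size and epoch count below are illustrative choices.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

train_loader = DataLoader(train_set, batch_size=8, shuffle=True)
val_loader = DataLoader(val_set, batch_size=8)

model = Network(num_classes=2)                 # assumes a two-class dataset
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

history = {"train_loss": [], "val_loss": [], "train_acc": [], "val_acc": []}
for epoch in range(50):
    model.train()
    loss_sum, correct = 0.0, 0
    for images, labels in train_loader:
        optimizer.zero_grad()
        outputs = model(images)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
        loss_sum += loss.item() * images.size(0)
        correct += (outputs.argmax(1) == labels).sum().item()
    history["train_loss"].append(loss_sum / len(train_set))
    history["train_acc"].append(correct / len(train_set))

    model.eval()
    loss_sum, correct = 0.0, 0
    with torch.no_grad():
        for images, labels in val_loader:
            outputs = model(images)
            loss_sum += criterion(outputs, labels).item() * images.size(0)
            correct += (outputs.argmax(1) == labels).sum().item()
    history["val_loss"].append(loss_sum / len(val_set))
    history["val_acc"].append(correct / len(val_set))

# history can then be plotted with matplotlib: one figure for the loss curves
# and one for the accuracy curves.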
5. Test [5 marks]
Test the trained model on the test set. Show the accuracy and the confusion matrix using Python commands.
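One possible approach is sketched below, assuming the trained model and the test_set from the previous steps and using scikit-learn for the metrics.
# A minimal test sketch; scikit-learn's metrics are one possible choice.
import torch
from torch.utils.data import DataLoader
from sklearn.metrics import accuracy_score, confusion_matrix

test_loader = DataLoader(test_set, batch_size=8)

model.eval()
all_preds, all_labels = [], []
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(1)
        all_preds.extend(preds.tolist())
        all_labels.extend(labels.tolist())

print("test accuracy:", accuracy_score(all_labels, all_preds))
print("confusion matrix:\n", confusion_matrix(all_labels, all_preds))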
Q3. Real-world Application Questions [20 marks]
Give ONE specific real-world problem that can be solved by neural networks. Answer the questions below
(the answer to each question should not exceed 200 words).
1. Detail the issues raised by this real-world problem, and explain how neural networks may be used to
address these issues. [5 marks]
2. Choose an established neural network to tackle the problem. Specify the chosen network and cite
the paper in which this model was published. Explain why you chose it. [5 marks]
3. How would you collect your training data? Do you need labeled data to train the network? If your answer is
yes, specify what kind of labels you need. If your answer is no, indicate how you would train the network with
unlabeled data. [5 marks]
4. Define the metric(s) to assess the network. Justify why the metric(s) was/were chosen. [5 marks]
The End
Marking Criteria
(1). The marks for each step in Q2 are divided into two parts:

Rubrics               | Marking Scheme                                                                                                                                              | Marks
Program [60%]         | The code works with a clear layout and some comments. The outputs make some sense.                                                                          | 60%
                      | The code works and the outputs make some sense.                                                                                                             | 40%
                      | Some of the component parts of the problem can be seen in the solution, but the program cannot produce any outcome. The code is difficult to read in places. | 20%
                      | The component parts of the program are incorrect or incomplete, providing a program of limited functionality that meets only some of the given requirements. The code is difficult to read. | 0%
Question Answer [40%] | All questions are answered correctly, with plentiful evidence of clear understanding of the CNN.                                                            | 40%
                      | Some of the answers are not correct, but there is convincing evidence of understanding of the CNN.                                                          | 20%
                      | Answers are incorrect, with very little evidence of understanding of the CNN.                                                                               | 0%
(2). Marking scheme for each sub-question in Q3

Marks | Scope, quantity and relevance of studied material                                                                                                                                                                                   | Evidence of understanding (through critical analysis)
5     | High quality of originality. Extensive and relevant literature has been creatively chosen, outlined and located in an appropriate context.                                                                                          | There is plentiful evidence of clear understanding of the topic.
4     | Shows originality. The major key points and literature have been outlined and put in an adequate context. The major points of those sources are reasonably brought out and related in a way which reveals some grasp of the topic in question. | There is convincing evidence of understanding of the topic.
3     | Effort has gone into developing a set of original ideas. Some relevant key points and literature are outlined, but this outline is patchy, unclear and/or not located in an adequate context.                                       | There is some evidence of understanding of the topic.
2     | May demonstrate an incomplete grasp of the task and will show only intermittent signs of originality. There is some mention of relevant key points, but this outline is very patchy, unclear, and/or very inadequately placed in context. | There is limited evidence of understanding of the topic.
1     | Shows very limited ability to recognise the issues represented by the brief. There is little mention of relevant key points.                                                                                                        | There is very little evidence of understanding of the topic.
