COMP9414 24T2
Artificial Intelligence
Assignment 1 - Artificial neural networks
Due: Week 5, Wednesday, 26 June 2024, 11:55 PM.
1 Problem context
Time Series Air Quality Prediction with Neural Networks: In this assignment, you will delve into the realm of time series prediction using neural network architectures. You will explore both classification and estimation tasks using a publicly available dataset.

You will be provided with a dataset named "Air Quality" [1], available on the UCI Machine Learning Repository (https://archive.ics.uci.edu/dataset/360/air+quality). We have tailored this dataset for this assignment and made some modifications; therefore, please use only the attached dataset for this assignment.
The given dataset contains 8,358 instances of hourly averaged responses from an array of five metal oxide chemical sensors embedded in an air quality chemical multisensor device. The device was located in the field in a significantly polluted area at road level within an Italian city. Data were recorded from March 2004 to February 2005 (one year), representing the longest freely available recordings of on-field deployed air quality chemical sensor device responses. Ground truth hourly averaged concentrations for carbon monoxide, non-methane hydrocarbons, benzene, total nitrogen oxides, and nitrogen dioxide, among other variables, were provided by a co-located reference-certified analyser. The variables included in the dataset are listed in Table 1. Missing values within the dataset are tagged with the value -200.
Table 1: Variables within the dataset.

  Variable         Meaning
  CO(GT)           True hourly averaged concentration of carbon monoxide
  PT08.S1(CO)      Hourly averaged sensor response
  NMHC(GT)         True hourly averaged overall Non Metanic HydroCarbons concentration
  C6H6(GT)         True hourly averaged Benzene concentration
  PT08.S2(NMHC)    Hourly averaged sensor response
  NOx(GT)          True hourly averaged NOx concentration
  PT08.S3(NOx)     Hourly averaged sensor response
  NO2(GT)          True hourly averaged NO2 concentration
  PT08.S4(NO2)     Hourly averaged sensor response
  PT08.S5(O3)      Hourly averaged sensor response
  T                Temperature
  RH               Relative Humidity
  AH               Absolute Humidity
2 Activities
This assignment focuses on two main objectives:
• Classification Task: You should develop a neural network that can predict whether the concentration of Carbon Monoxide (CO) exceeds a certain threshold – the mean of the CO(GT) values – based on historical air quality data. This task involves binary classification, where your model learns to classify instances into two categories: above or below the threshold. To determine the threshold, you must first calculate the mean value of CO(GT), excluding unknown data (missing values). Then, use this threshold to decide whether the value predicted by your network is above or below it. You are free to choose and design your own network, and there are no limitations on its structure. However, your network should be capable of handling missing values (a minimal sketch of the thresholding step is given at the end of this section).
• Regression Task: You should develop a neural network that can predict the concentration of Nitrogen Oxides (NOx) based on other air quality features. This task involves estimating a continuous numerical value (the NOx concentration) from the input features using regression techniques. You are free to choose and design your own network, with no limitation on its structure; however, your model should be able to deal with missing values.
In summary, the classification task aims to divide instances into two categories (exceeding or not exceeding the CO(GT) threshold), while the regression task aims to predict a continuous numerical value (the NOx concentration).
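As a concrete illustration of the thresholding step for the classification task, here is a minimal sketch, not a required implementation. It assumes the tailored dataset is available as a plain CSV file (the name "air_quality.csv" is illustrative) with the column names of Table 1 and missing values tagged with -200.

```python
import pandas as pd
import numpy as np

# "air_quality.csv" is an illustrative file name; the column names follow Table 1.
df = pd.read_csv("air_quality.csv")

# Replace the -200 missing-value tags so they do not distort the mean.
co = df["CO(GT)"].replace(-200, np.nan)

# Threshold = mean of the known CO(GT) values (missing values excluded).
threshold = co.mean(skipna=True)

# Binary target: 1 if CO(GT) exceeds the threshold, 0 otherwise.
# Rows whose CO(GT) value is unknown cannot be labelled and are dropped here.
labels = (co > threshold).astype(int)[co.notna()]
print(f"Threshold: {threshold:.2f}  positive rate: {labels.mean():.2%}")
```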
2.1 Data preprocessing
It is expected that you analyse the provided data and perform any required preprocessing. Some of the tasks during preprocessing might include the ones shown below; however, not all of them are necessary, and you should evaluate each of them against the results obtained. A minimal preprocessing sketch is given after this list.
(a) Identify the variation range of the input and output variables.
(b) Plot each variable to observe the overall behaviour of the process.
(c) In case outliers or missing data are detected, correct the data accordingly.
(d) Split the data for training and testing.
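The sketch below shows one possible way to carry out steps (a)-(d) with pandas and scikit-learn. The file name, the interpolation choice, and the split ratio are illustrative assumptions, not requirements.

```python
import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split

# "air_quality.csv" is an illustrative file name; columns follow Table 1.
df = pd.read_csv("air_quality.csv")

# Keep the numeric variables and turn the -200 missing-value tags into NaN.
data = df.select_dtypes("number").replace(-200, np.nan)

# (a) Variation ranges of the input and output variables.
print(data.agg(["min", "max"]).T)

# (b) Overall behaviour of each variable over time.
data.plot(subplots=True, figsize=(10, 20))

# (c) One possible (not mandatory) correction: interpolate short gaps and
#     drop rows that still miss the CO(GT) target.
data = data.interpolate(limit=3).dropna(subset=["CO(GT)"])

# (d) Train/test split (no shuffling here, since the data form a time series).
train_df, test_df = train_test_split(data, test_size=0.2, shuffle=False)
```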
2.2 Design of the neural network
You should select and design neural architectures addressing both the classification and the regression problems described above. In each case, consider the following steps (a minimal Keras sketch follows this list):
(a) Design the network and decide the number of layers, units, and their respective activation functions.
(b) Remember that it is recommended that the number of trainable parameters of your network satisfies Nw < (number of samples)/10.
(c) Create the neural network using Keras and TensorFlow.
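Here is a minimal Keras sketch of one possible classification architecture together with the parameter-count check from step (b). The layer sizes and the number of input features are illustrative assumptions only; a regression network would be analogous, with a single linear-output unit instead of the sigmoid.

```python
import tensorflow as tf
from tensorflow import keras

n_features = 12        # number of input variables used (assumption)
n_samples = 8358       # instances in the provided dataset

model = keras.Sequential([
    keras.layers.Input(shape=(n_features,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),  # binary output
])

# Step (b): keep the number of trainable parameters below n_samples / 10.
n_params = model.count_params()
assert n_params < n_samples / 10, f"{n_params} parameters exceeds the guideline"
model.summary()
```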
2.3 Training
In this section, you have to train your proposed neural network. Consider the following steps (a minimal training sketch follows this list):
(a) Decide the training parameters, such as the loss function, optimizer, batch size, learning rate, and number of epochs.
(b) Train the neural model and monitor the loss values during the process.
(c) Verify possible overfitting problems.
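The sketch below trains the classification model defined above. The optimizer, learning rate, batch size, and epoch count are illustrative assumptions, and X_train/y_train stand for your preprocessed training features and labels.

```python
# Compile with a binary-classification loss and an accuracy metric.
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Train while holding out part of the training data for validation.
history = model.fit(
    X_train, y_train,
    validation_split=0.2,
    batch_size=32,
    epochs=100,
    verbose=1,
)

# A growing gap between training and validation loss is a sign of overfitting.
```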
2.4 Validating the neural model
Assess your results by plotting the training results and the network response for the test inputs against the test targets. Compute error indexes to complement the visual analysis.
(a) For the classification task, draw two different plots to illustrate your results over different epochs. In the first plot, show the training and validation loss over the epochs. In the second plot, show the training and validation accuracy over the epochs. For example, Figure 1 and Figure 2 show loss and classification accuracy plots for 100 epochs, respectively; a minimal plotting sketch is given after the figure captions below.
Figure 1: Loss plot for the classification task
Figure 2: Accuracy plot for the classification task
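The following sketch produces the two requested plots, assuming `history` is the object returned by model.fit(...) above, trained with validation data and an accuracy metric.

```python
import matplotlib.pyplot as plt

# Plot 1: training vs. validation loss over the epochs.
plt.figure()
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("Epoch"); plt.ylabel("Loss"); plt.legend()

# Plot 2: training vs. validation accuracy over the epochs.
plt.figure()
plt.plot(history.history["accuracy"], label="training accuracy")
plt.plot(history.history["val_accuracy"], label="validation accuracy")
plt.xlabel("Epoch"); plt.ylabel("Accuracy"); plt.legend()
plt.show()
```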
(b) For the classification task, compute a confusion matrix (https://en.wikipedia.org/wiki/Confusion_matrix) including True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN) counts, as shown in Table 2. Moreover, report the accuracy and precision for your test data and mention the number of tested samples, as shown in Table 3 (the numbers shown in both tables are randomly chosen and may not be consistent with each other). For instance, the Sklearn library offers a wide range of metric functions (https://scikit-learn.org/stable/api/sklearn.metrics.html), including the confusion matrix (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html), accuracy, and precision. You can use Sklearn's built-in metric functions to calculate the mentioned metrics or develop your own functions. A sketch using these functions is given after Table 3.
Table 2: Confusion matrix for the test data for the classification task.

  Confusion Matrix        Positive (Actual)   Negative (Actual)
  Positive (Predicted)    103                 6
  Negative (Predicted)    6                   75
Table 3: Accuracy and precision for the test data for the classification task.

                          Accuracy   Precision   Number of Samples
  CO(GT) classification   63%        60%         190
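For instance, assuming X_test and y_test hold the preprocessed test features and the true binary labels, the sketch below computes the confusion matrix, accuracy, and precision with Sklearn.

```python
from sklearn.metrics import confusion_matrix, accuracy_score, precision_score

# Turn the model's output probabilities into hard 0/1 predictions.
y_pred = (model.predict(X_test) > 0.5).astype(int).ravel()

# Confusion matrix: rows are actual classes, columns are predicted classes.
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print(f"TP={tp}  TN={tn}  FP={fp}  FN={fn}")

print(f"Accuracy : {accuracy_score(y_test, y_pred):.2%}")
print(f"Precision: {precision_score(y_test, y_pred):.2%}")
print(f"Number of tested samples: {len(y_test)}")
```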
(c) For the regression task, draw two different plots to illustrate your results. In the first plot, show how the selected loss function varies for both the training and validation sets through the epochs. In the second plot, show the final estimation results for the validation set. For instance, Figure 3 and Figure 4 show the loss function and the network outputs vs. the actual NOx(GT) values for a validation set, respectively. In Figure 4 no data preprocessing has been performed; however, as mentioned above, it is expected that you include this in your assignment.
Figure 3: Loss plot for the regression task.
Figure 4: Estimated and actual NOx(GT) for the validation set.

(d) For the regression task, report performance indexes, including the Root Mean Squared Error (RMSE), the Mean Absolute Error (MAE) (see a discussion in [2]), and the number of samples used for your estimation of the NOx(GT) values, in a table. The RMSE measures the differences between the observed values and the predicted ones and is defined as follows:

    RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2},    (1)

where n is the number of samples, Y_i is the actual label, and \hat{Y}_i is the predicted value. In the same way, the MAE can be defined as the average of the absolute errors as follows:

    MAE = \frac{1}{n} \sum_{i=1}^{n} |Y_i - \hat{Y}_i|.    (2)
Table 4 shows an example of the performance indexes (all numbers are randomly chosen and may not be consistent with each other). As mentioned before, the Sklearn library offers a wide range of metric functions, including RMSE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.root_mean_squared_error.html) and MAE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.mean_absolute_error.html). You can use Sklearn's built-in metric functions to calculate the mentioned metrics or develop your own functions; a minimal sketch is given after Table 4.
Table 4: Result table for the test data for the regression task.

  RMSE    MAE     Number of Samples
  90.60   50.35   55
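As an illustration of computing these indexes with Sklearn, here is a minimal sketch. The names reg_model, X_test_reg, and y_test_reg are placeholders for your trained regression network and test data, and root_mean_squared_error requires a recent scikit-learn version (older versions can use mean_squared_error(..., squared=False)).

```python
from sklearn.metrics import root_mean_squared_error, mean_absolute_error

# Predicted NOx(GT) values for the test inputs.
y_pred_reg = reg_model.predict(X_test_reg).ravel()

rmse = root_mean_squared_error(y_test_reg, y_pred_reg)
mae = mean_absolute_error(y_test_reg, y_pred_reg)
print(f"RMSE: {rmse:.2f}  MAE: {mae:.2f}  Number of samples: {len(y_test_reg)}")
```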
3 Testing and discussing your code
As part of the assignment evaluation, your code will be tested by tutors together with you in a discussion session carried out during the week 6 tutorial. The assignment has a total of 25 marks. The discussion is mandatory and, therefore, we will not mark any assignment that is not discussed with tutors.
You are expected to propose and build neural models for the classification and regression tasks. The minimal output we expect to see is the set of results mentioned above in Section 2.4. You will receive marks for each of these subsections as shown in Table 5, i.e. 7 marks in total. However, it is fine if you want to include any other outcome to highlight particular aspects when testing and discussing your code with your tutor.
For marking your results, you should be prepared to simulate your neural model with a generalisation set we have kept aside for that purpose. You must anticipate this by including in your submission a script ready to open a file (with the same characteristics as the given dataset but with fewer data points), simulate the network, and perform all the validation tests described in Section 2.4 (b) and (d) (accuracy, precision, RMSE, MAE). It is recommended that you save all of your hyper-parameters and weights (your model in general) so you can call your network and perform the analysis later in your discussion session; a minimal save/load sketch is shown below.
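The sketch below shows one way to persist a trained Keras model and later reload it for the generalisation test. The file names "co_classifier.keras" and "generalisation.csv" are hypothetical and used for illustration only.

```python
from tensorflow import keras
import pandas as pd

# After training: persist the architecture, weights, and optimizer state.
model.save("co_classifier.keras")

# In the discussion session: reload the model and the generalisation file,
# apply the same preprocessing as for the training data, and report metrics.
model = keras.models.load_model("co_classifier.keras")
gen_df = pd.read_csv("generalisation.csv")
# ... preprocess gen_df exactly as the training data, then compute
# accuracy/precision (classification) and RMSE/MAE (regression).
```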
For the classification task you need to compute accuracy and precision, while for the regression task you need to compute RMSE and MAE, using the generalisation set. You will receive 3 marks for each task, given successful results. Expected results should be as follows:
• For the classification task, your network should achieve at least 85% accuracy and precision. Accuracy and precision lower than that will result in a score of 0 marks for that specific section.
• For the regression task, it is expected that you achieve an RMSE of at most 280 and an MAE of at most 220 for unseen data points. Errors higher than the mentioned values will receive 0 marks.
Finally, you will receive 1 mark for code readability for each task, and your tutor will also give you a maximum of 5 marks for each task depending on the level of code understanding, as follows: 5. Outstanding, 4. Great, 3. Fair, 2. Low, 1. Deficient, 0. No answer.
Table 5: Marks for each task.

  Task                                                                         Marks
  Results obtained with given dataset
    Loss and accuracy plots for classification task                            2 marks
    Confusion matrix and accuracy and precision tables for classification task 2 marks
    Loss and estimated NOx(GT) plots for regression task                       2 marks
    Performance indexes table for regression task                              1 mark
  Results obtained with generalisation dataset
    Accuracy and precision for classification task                             3 marks
    RMSE and MAE for regression task                                           3 marks
  Code understanding and discussion
    Code readability for classification task                                   1 mark
    Code readability for regression task                                       1 mark
    Code understanding and discussion for classification task                  5 marks
    Code understanding and discussion for regression task                      5 marks
  Total marks                                                                  25 marks
4 Submitting your assignment
The assignment must be done individually. You must submit your assignment solution via Moodle. This will consist of a single .ipynb Jupyter file. This file should contain all the necessary code for reading files, data preprocessing, the network architectures, and the result evaluations. Additionally, your file should include short text descriptions to help markers better understand your code. Please be mindful that providing clean and easy-to-read code is a part of your assignment.

Please indicate your full name and your zID at the top of the file as a comment. You can submit as many times as you like before the deadline – later submissions overwrite earlier ones. After submitting your file, a good practice is to take a screenshot of it for future reference.
Late submission penalty: UNSW has a standard late submission penalty of 5% per day off your mark, capped at five days from the assessment deadline; after that, students cannot submit the assignment.
5 Deadline and questions
Deadline: Week 5, Wednesday, 26 June 2024, 11:55 PM. Please use the forum on Moodle to ask questions related to the project. We will prioritise questions asked in the forum. However, you should not share your code in the forum, to avoid making it public and enabling possible plagiarism; in that case, use the course email cs9414@cse.unsw.edu.au as an alternative.

Although we try to answer questions as quickly as possible, we might take up to 1 or 2 business days to reply; therefore, last-minute questions might not be answered in time.
6 Plagiarism policy
Your program must be entirely your own work. Plagiarism detection software
might be used to compare submissions pairwise (including submissions for
any similar projects from previous years) and serious penalties will be applied,
particularly in the case of repeat offences.
Do not copy from others. Do not allow anyone to see your code.
Please refer to the UNSW Policy on Academic Honesty and Plagiarism if you
require further clarification on this matter.
References
[1] De Vito, S., Massera, E., Piga, M., Martinotto, L. and Di Francia, G.,
2008. On field calibration of an electronic nose for benzene estimation in an
urban pollution monitoring scenario. Sensors and Actuators B: Chemical,
129(2), pp.750-757.
[2] Hodson, T. O. 2022. Root mean square error (RMSE) or mean absolute
error (MAE): When to use them or not. Geoscientific Model Development
Discussions, 2022, 1-10.

請(qǐng)加QQ:99515681  郵箱:99515681@qq.com   WX:codinghelp













 

標(biāo)簽:

掃一掃在手機(jī)打開(kāi)當(dāng)前頁(yè)
  • 上一篇:代寫(xiě)指標(biāo)編寫(xiě) 編寫(xiě)同花順指標(biāo)公式 代編公式
  • 下一篇:ECON2101代做、代寫(xiě)Python/c++設(shè)計(jì)編程
  • CMT219代寫(xiě)、代做Java程序語(yǔ)言
  • 代做MATH1033、代寫(xiě)c/c++,Java程序語(yǔ)言
  • 代做CSCI 2525、c/c++,Java程序語(yǔ)言代寫(xiě)
  • COMP 315代寫(xiě)、Java程序語(yǔ)言代做
  • 昆明生活資訊

    昆明圖文信息
    蝴蝶泉(4A)-大理旅游
    蝴蝶泉(4A)-大理旅游
    油炸竹蟲(chóng)
    油炸竹蟲(chóng)
    酸筍煮魚(yú)(雞)
    酸筍煮魚(yú)(雞)
    竹筒飯
    竹筒飯
    香茅草烤魚(yú)
    香茅草烤魚(yú)
    檸檬烤魚(yú)
    檸檬烤魚(yú)
    昆明西山國(guó)家級(jí)風(fēng)景名勝區(qū)
    昆明西山國(guó)家級(jí)風(fēng)景名勝區(qū)
    昆明旅游索道攻略
    昆明旅游索道攻略
  • NBA直播 短信驗(yàn)證碼平臺(tái) 幣安官網(wǎng)下載 歐冠直播 WPS下載

    關(guān)于我們 | 打賞支持 | 廣告服務(wù) | 聯(lián)系我們 | 網(wǎng)站地圖 | 免責(zé)聲明 | 幫助中心 | 友情鏈接 |

    Copyright © 2025 kmw.cc Inc. All Rights Reserved. 昆明網(wǎng) 版權(quán)所有
    ICP備06013414號(hào)-3 公安備 42010502001045

    狠狠综合久久久久综合网址-a毛片网站-欧美啊v在线观看-中文字幕久久熟女人妻av免费-无码av一区二区三区不卡-亚洲综合av色婷婷五月蜜臀-夜夜操天天摸-a级在线免费观看-三上悠亚91-国产丰满乱子伦无码专区-视频一区中文字幕-黑人大战欲求不满人妻-精品亚洲国产成人蜜臀av-男人你懂得-97超碰人人爽-五月丁香六月综合缴情在线
  • <dl id="akume"></dl>
  • <noscript id="akume"><object id="akume"></object></noscript>
  • <nav id="akume"><dl id="akume"></dl></nav>
  • <rt id="akume"></rt>
    <dl id="akume"><acronym id="akume"></acronym></dl><dl id="akume"><xmp id="akume"></xmp></dl>
    黄色在线视频网| 精品一区二区中文字幕| 亚洲狼人综合干| 欧美少妇一区二区三区| 日韩av片专区| 手机av在线免费| 99色这里只有精品| 欧美色图另类小说| 波多野结衣网页| 天天爽天天爽夜夜爽| 波多野结衣乳巨码无在线| 免费av观看网址| 中文字幕亚洲乱码| 国产夫妻自拍一区| 国产野外作爱视频播放| 神马午夜伦理影院| 一二三av在线| 亚洲一区二区在线视频观看| a级网站在线观看| 无码无遮挡又大又爽又黄的视频| 国产91视频一区| 日韩av.com| 日韩中文字幕组| 国产视频一区二区三区在线播放| www.亚洲成人网| 国产91视频一区| 欧美一区二区三区综合| 九九久久久久久| 男女激情免费视频| 国产精品裸体瑜伽视频| 亚洲少妇第一页| 天天干天天av| 国产a级片免费看| 大地资源网在线观看免费官网| 国产精品久久久毛片| 国产综合免费视频| 一二三av在线| 欧美精品自拍视频| 国产精品久久久久9999小说| 国产九色porny| 亚洲高清免费在线观看| 国产大尺度在线观看| 97精品国产97久久久久久粉红| 国产美女主播在线| 黄色小视频免费网站| 三上悠亚久久精品| 4444在线观看| 999这里有精品| 爆乳熟妇一区二区三区霸乳| 毛片毛片毛片毛片毛| 成人毛片一区二区| 日韩精品福利片午夜免费观看| 日韩精品一区二区在线视频 | 国产福利影院在线观看| 午夜av中文字幕| 国产不卡一区二区视频| 九九九在线观看视频| 国产精品专区在线| 亚洲色欲久久久综合网东京热| 五月天开心婷婷| 国产小视频精品| 亚洲综合激情视频| www.xxx亚洲| а 天堂 在线| 国产精品无码电影在线观看| 欧美特级aaa| 色婷婷成人在线| 天天操狠狠操夜夜操| 777久久久精品一区二区三区| 精品久久一二三| 国产日韩成人内射视频| 蜜臀av免费观看| 最新视频 - x88av| 男人天堂网视频| 欧美视频在线播放一区| 日韩精品一区二区免费| 99精品免费在线观看| 日韩久久一级片| 久久久久久久久久久久久国产| 青青在线免费视频| 中文字幕 日韩 欧美| www.日本在线播放| jizz18女人| 妺妺窝人体色www在线小说| 精品视频免费在线播放| 91看片在线免费观看| 欧美另类videos| 一个色综合久久| 色七七在线观看| 97超碰青青草| 久久综合久久色| 日本www在线播放| 欧美黄色免费网址| 狠狠干视频网站| 中文字幕乱码免费| 亚洲综合20p| 国产免费色视频| 奇米影视亚洲色图| 91传媒久久久| 成人免费无码av| 网站一区二区三区| 九九久久久久久| 国产日韩欧美大片| 欧美激情精品久久久久久小说| 一区二区三区欧美精品| www.国产福利| 91看片淫黄大片91| 日本a级片免费观看| 久久久久久久久久久久91| 亚洲天堂国产视频| 免费视频爱爱太爽了| 成人免费观看毛片| 樱空桃在线播放| 北条麻妃在线一区| 艳母动漫在线免费观看| 91九色在线观看视频| 国产午夜伦鲁鲁| 亚洲免费成人在线视频| 国产特级淫片高清视频| 伊人五月天婷婷| 亚洲无吗一区二区三区| 欧美中文字幕在线观看视频 | 欧美视频亚洲图片| 丰满女人性猛交| 色综合五月婷婷| 无码人妻h动漫| 久久精品午夜福利| 中文字幕黄色大片| 精品久久久久久久无码| 影音先锋成人资源网站| 久久久久免费精品| 亚洲成人动漫在线| 国产高清视频网站| 男人用嘴添女人下身免费视频| 手机在线视频一区| 欧美一级免费在线| 久久久久久久少妇| 欧美老熟妇喷水| 午夜免费福利网站| 亚洲免费999| 一区二区三区入口| 99re6在线观看| 欧美性猛交xxxx乱大交91| wwwwxxxx日韩| 国产 porn| 欧美精品一区二区性色a+v| www婷婷av久久久影片| 国产91沈先生在线播放| 免费观看亚洲视频| 国产免费内射又粗又爽密桃视频| 伊人五月天婷婷| 精品一区二区三区无码视频| www插插插无码视频网站| 日韩video| 国内自拍在线观看| 欧美一级特黄a| 性做爰过程免费播放| 日韩欧美国产免费| 国产色视频在线播放| 国产精品v日韩精品v在线观看| 欧美三级理论片| 亚洲第一中文av| 一级性生活视频| 亚洲一级片网站| 亚洲视频第二页| 1024av视频| 五月激情五月婷婷| 成人小视频在线观看免费| 亚洲这里只有精品| www.99av.com| 六月婷婷激情综合| 国产999免费视频| 浓精h攵女乱爱av| 中文字幕天天干| 国产无限制自拍| 日本福利视频一区| 日本在线一二三区| 不卡的av中文字幕| 男女av免费观看| 在线观看高清免费视频| 男女男精品视频站| 日韩成人精品视频在线观看| 亚洲精品高清无码视频| 99在线精品免费视频| 日本五级黄色片| 国产精品沙发午睡系列| 久久亚洲中文字幕无码| 超碰免费在线公开| 妺妺窝人体色www看人体| 无码粉嫩虎白一线天在线观看| 日本道在线视频| 日本一区午夜艳熟免费| 男人日女人bb视频| 成人黄色av片| 九九九九九九九九| 久久久久久www| 99爱视频在线| 欧美交换配乱吟粗大25p| 一女被多男玩喷潮视频| 欧美又黄又嫩大片a级|