COMP9024 24T1 Assignment
The Missing Pages
Data Structures and Algorithms
Change Log
We may make minor changes to the spec to address/clarify some outstanding issues. These may require minimal changes in your design/code, if at all.
Students are strongly encouraged to check the online forum discussion and the change log regularly.
Version 1.0
(2024-03-15 17:00)
Initial release.
Background
As we have mentioned in lectures, the Internet can be thought of as a graph (a very large graph). Web pages represent vertices and hyperlinks represent
directed edges.
With almost 1.1 billion unique websites (as of February 2024), and each website having multiple webpages, and each webpage having multiple hyperlinks, it
can understandably be a very difficult job to remember the URL of every website you want to visit.
In order to make life easier, from the very early days of the internet, there have been search engines that can be used to find websites.
But the job of a search engine is very difficult: First it must index (create a representation of) the entire (or as close to it as possible) World Wide Web. Next it must rank the webpages it finds.
In this assignment we will be implementing algorithms to solve each of these problems, and figure out the fastest way to navigate from one page to another.
1. To index the internet we will be creating a web crawler.
2. To rank webpages we will implement the PageRank algorithm.
3. To find the shortest path between two pages we will implement Dijkstra's algorithm.
The Assignment
Starter Files
Download this zip file.
Unzipping the file will create a directory called 'assn' with all the assignment start-up files.
Alternatively, you can achieve the same thing from a terminal with commands such as:
prompt$ curl https://www.cse.unsw.edu.au/~cs9024/24T1/assn/assn.zip -o assn.zip
prompt$ unzip assn.zip -d assn
The first command will download assn.zip into the current working directory, then the second command will extract it into a sub-directory assn.
You can also make note of the following URLs:
http://www.cse.unsw.edu.au/~cs9024/micro-web
http://www.cse.unsw.edu.au/~cs9024/mini-web
Here is a visual representation of the micro-web:
[Figure: index.html links to X.html, Y.html, and Z.html; X.html and Y.html link back to index.html]
Once you read the assignment specification, hopefully it will be clear to you how these URLs might be useful. You may also find it useful to construct a similar
visual representation for the mini-web.
Overall File Structure
Below is a reference for each file and their purpose.
Note: You cannot modify ANY of the header (.h) files.
Provided File | Description                                               | Implemented In
crawler.c     | A driver program to crawl the web                         | -
dijkstra.h    | Interface for the Shortest Path functions (Subset 4)      | graph.c
graph.h       | Interface for the Graph ADT (Subset 1b)                   | graph.c
list.h        | Interface for the List ADT (Subset 1a)                    | list.c
Makefile      | A build script to compile the crawler into an executable  | -
pagerank.h    | Interface for the PageRank functions (Subset 3)           | graph.c
Your task will be to provide the necessary implementations to complete this project.
Subset 1 - Dependencies
Before we can start crawling we need to be able to store our crawled data. As the internet is a Graph, this means we need a Graph ADT. We will also need a Set
ADT and one of a Queue ADT or a Stack ADT, in order to perform web scraping (for a BFS or DFS).
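To see why, here is a minimal sketch of the crawl loop in C. The function names (list_create, list_enqueue, list_dequeue, list_add, list_contains, list_is_empty, list_destroy) are assumptions for illustration only; check list.h for the actual interface before writing any code.

#include <stdbool.h>
#include <stdlib.h>

void bfs_crawl(char *start_url) {
    struct list *queue = list_create();  // frontier: pages waiting to be visited
    struct list *seen = list_create();   // set: pages already discovered

    list_add(seen, start_url);
    list_enqueue(queue, start_url);

    while (!list_is_empty(queue)) {
        char *url = list_dequeue(queue);
        // fetch the page at url and add its vertex/edges to the graph, then
        // for each hyperlink `next` found on the page:
        //     if (!list_contains(seen, next)) {
        //         list_add(seen, next);
        //         list_enqueue(queue, next);
        //     }
        free(url);  // assuming your dequeue hands back ownership of a copy
    }

    list_destroy(queue);
    list_destroy(seen);
}

Swapping the queue operations for stack operations would turn the same loop into a DFS.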
Subset 1a - Implement the List (Queue, Stack, Set) ADT
You have been provided with a file list.h. Examine the file carefully. It provides the interface for an ADT that will provide Queue, Stack, and Set functionality.
Your task is to implement the functions prototyped in the list.h header file within list.c.
You must create the file list.c to implement this ADT.
You must store string (char *) data within the ADT.
You must allocate memory dynamically.
You must not modify the list.h file.
You must not modify the function prototypes declared in the list.h file.
You may add utility functions to the list.c file.
You may use the string.h library, and other standard libraries from the weekly exercises.
You may reuse code previously submitted for weekly assessments and provided in the lectures.
You may use whatever internal representation you like for your list ADT, provided it does not contradict any of the above.
You may assume that any instance of your list ADT will only be used as a queue or a stack or a set.
You should write programs that use your ADT to test and debug your code.
You should use valgrind to verify that your ADT does not leak memory.
As a reminder:
Queue - First In, First Out
Stack - First In, Last Out
Set - Only stores unique values.
See list.h for more information about each function that you are required to implement.
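For instance, a small driver like the following (again using hypothetical function names; substitute the real prototypes from list.h) can be compiled against list.c and run under valgrind:

#include <assert.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    // Queue behaviour: First In, First Out
    struct list *q = list_create();
    list_enqueue(q, "first");
    list_enqueue(q, "second");
    char *out = list_dequeue(q);
    assert(strcmp(out, "first") == 0);
    free(out);  // assuming the ADT returns ownership of a copied string
    list_destroy(q);

    // Set behaviour: duplicates are not stored
    struct list *s = list_create();
    list_add(s, "a");
    list_add(s, "a");
    assert(list_contains(s, "a"));
    list_destroy(s);

    return 0;
}

prompt$ gcc -Wall -Werror -o test_list test_list.c list.c
prompt$ valgrind ./test_list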
Testing
We have created a script to automatically test your list ADT. It expects to find list.c in the current working directory. Limited test cases are provided, so you
should always do your own, more thorough, testing.
prompt$ 9024 dryrun assn_list
Subset 1b - Implement the Graph ADT
You have been provided with a file graph.h. Examine the file carefully. It provides the interface for an ADT that will provide Graph functionality. The graph is
both weighted and directed.
Your task is to implement the functions prototyped in the graph.h header file within graph.c.
You must create the file graph.c to implement this ADT.
You must use an adjacency list representation, but the exact representation is up to you.
You must use string (char *) data to label the vertices.
You must allocate memory dynamically.
You must not modify the graph.h file.
You must not modify the function prototypes declared in the graph.h file.
You may add utility functions to the graph.c file.
You may use the string.h library, and other standard libraries from the weekly exercises.
You may reuse code previously submitted for weekly assessments and provided in the lectures.
You should write programs that use your ADT to test and debug your code.
You should use valgrind to verify that your ADT does not leak memory.
See graph.h for more information about each function that you are required to implement.
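One plausible adjacency list layout is sketched below. This is purely illustrative: graph.h only fixes the public interface, and every field here is an assumption you are free to change.

#include <stddef.h>

struct edge {
    size_t to;           // index of the destination vertex
    size_t weight;       // edge weight (the crawler inserts 1)
    struct edge *next;   // next edge in this vertex's adjacency list
};

struct vertex {
    char *label;         // dynamically allocated copy of the URL
    struct edge *edges;  // head of the outbound adjacency list
    double pagerank;     // filled in by graph_pagerank (Subset 3)
    double oldrank;      // previous iteration's rank, for convergence checks
};

struct graph {
    struct vertex *vertices;  // dynamic array, in insertion (BFS) order
    size_t nV;                // number of vertices currently stored
    size_t capacity;          // allocated length of the vertices array
};

Keeping vertices in insertion order matters later: both graph_show and the shortest-path output must follow the crawler's BFS order.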
Subset 2 - Web Crawler
We are now going to use the list and graph ADTs you have created to implement a web crawler.
Assuming your ADTs are implemented correctly, you should be able to compile the crawler using the provided build script:
prompt$ make crawler
Note: crawler.c requires external dependencies (libcurl and libxml2). The provided Makefile will work on CSE servers (ssh and vlab), but may not
work on your home computer.
After running the executable, check that the output aligns with the navigation of the sample website.
Carefully examine the code in crawler.c. Uncomment the block of code that uses scanf to take user input for the ignore_list.
The ignore list represents the URLs that we would like to completely ignore when we are calculating PageRanks, as if they did not exist in the graph. This means that any incoming and outgoing links from these URLs are treated as non-existent. You are required to implement this functionality locally - within the
graph_show function - and NOT change the representation of the actual graph structure within the ADT. For further details see the graph.h file.
If you have correctly implemented the ADTs from the previous tasks, this part should be mostly free.
crawler.c is a complete implementation of a web crawler; you do not need to modify the utility functions, only the bottom part of the main function. However,
you should look at the program carefully and understand it well so that you can use it (i.e., modify it appropriately) for later tasks.
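Filtering locally means the checks live in the printing loops, not in the stored graph. A sketch, assuming the illustrative struct graph above and a hypothetical list_contains helper (the real graph_show prototype is in graph.h):

#include <stddef.h>
#include <stdio.h>

void graph_show(struct graph *g, FILE *out, struct list *ignore_list) {
    // Vertices: skip any label on the ignore list
    for (size_t i = 0; i < g->nV; i++) {
        if (ignore_list && list_contains(ignore_list, g->vertices[i].label))
            continue;
        fprintf(out, "%s\n", g->vertices[i].label);
    }
    // Edges: skip an edge if either endpoint is ignored
    for (size_t i = 0; i < g->nV; i++) {
        if (ignore_list && list_contains(ignore_list, g->vertices[i].label))
            continue;
        for (struct edge *e = g->vertices[i].edges; e != NULL; e = e->next) {
            char *dest = g->vertices[e->to].label;
            if (ignore_list && list_contains(ignore_list, dest))
                continue;
            fprintf(out, "%s %s %zu\n", g->vertices[i].label, dest, e->weight);
        }
    }
}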
Sample Output
Using a modified crawler.c that simply calls graph_show on the micro-web, and without ignoring any pages, the output should be:
prompt$ ./crawler http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter a page to ignore or type 'done': done
http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html
http://www.cse.unsw.edu.au/~cs9024/micro-web/Y.html
http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html
http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html 1
http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html http://www.cse.unsw.edu.au/~cs9024/micro-web/Y.html 1
http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html 1
http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html 1
http://www.cse.unsw.edu.au/~cs9024/micro-web/Y.html http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html 1
prompt$
Now let's add index.html to the ignore list:
prompt$ ./crawler http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter a page to ignore or type 'done': http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter another page to ignore or type 'done': done
http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html
http://www.cse.unsw.edu.au/~cs9024/micro-web/Y.html
http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html
prompt$
All traces of index.html have been removed. This means that only the remaining vertices are displayed as there are no longer any edges. Note that the order of
the output matters. It should follow the BFS that is performed by the crawler. If your result does not follow this order, you will be marked as incorrect, even if your
graph is valid.
Testing
We have created a script to automatically test your list and graph ADTs. It expects to find list.c and graph.c in the current working directory. Limited test
cases are provided, so you should always do your own, more thorough, testing.
prompt$ 9024 dryrun assn_crawler
Subset 3 - PageRank
Background
Now that we can crawl a web and build a graph, we need a way to determine which pages (i.e. vertices) in our web are important.
We haven't kept page content so the only metric we can use to determine the importance of a page is to check how much other pages rely on its existence. That
is, how easy is it to follow a sequence of one or more links (edges) and end up on the page.
In 1998, Larry Page and Sergey Brin (a.k.a. Google), created the PageRank algorithm to evaluate this metric.
Google still uses the PageRank algorithm to score every page it indexes on the internet to help order its search results.
Task
In graph.c implement the two new functions graph_pagerank and graph_show_pagerank.
First, graph_pagerank should calculate and store the PageRank of each vertex (i.e. page) in the graph.
The algorithm must exclude the URLs that are provided in an 'ignore list' to the function. Do not remove the pages from the graph, only skip (i.e., ignore) them
from calculations. This means that you will need to understand which parts of the PageRank algorithm need to be modified.
Using the ignore list, you will be able to see what happens to the PageRanks as certain pages are removed. What should happen to the PageRank of a
particular page if you remove all pages linking to it?
Second, graph_show_pagerank should print the PageRank of every vertex (i.e. page) in the graph that is NOT in the ignore list.
Pages (vertices) should be printed from highest to lowest rank, based on their rounded (to 3 d.p.) rank. You should use the round function from the math.h
library. If two pages have the same rounded rank then they should be printed lexicographically.
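One way to get that ordering is a qsort comparator that compares ranks rounded to 3 decimal places and falls back to strcmp. This sketch assumes the illustrative struct vertex above (remember to link with -lm for round):

#include <math.h>
#include <string.h>

static int ranked_cmp(const void *a, const void *b) {
    const struct vertex *va = a;
    const struct vertex *vb = b;
    double ra = round(va->pagerank * 1000) / 1000;  // rank at 3 d.p.
    double rb = round(vb->pagerank * 1000) / 1000;
    if (ra > rb) return -1;  // higher rank prints first
    if (ra < rb) return 1;
    return strcmp(va->label, vb->label);  // tie: lexicographic order
}

graph_show_pagerank can then qsort a scratch copy of the non-ignored vertices with this comparator before printing.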
You may add more utility functions to graph.c.
You may (and most likely will need to) modify your struct definitions in graph.c.
You must not modify the file graph.h.
You must not modify the file pagerank.h.
You must not modify the function prototypes for graph_pagerank and graph_show_pagerank.
Algorithm
For $t = 0$:

$$PR(p_i; t) = \frac{1}{N}$$

For $t > 0$:

$$PR(p_i; t) = \frac{1 - d}{N} + d \left( \left( \sum_{p_j \in M(p_i)} \frac{PR(p_j; t - 1)}{D(p_j)} \right) + \left( \sum_{p_j \in S} \frac{PR(p_j; t - 1)}{N} \right) \right)$$

Where:
$N$ is the number of vertices
$p_i$ and $p_j$ are each some vertex
$t$ is the "time" (iteration count)
$PR(p_i; t)$ is the PageRank of vertex $p_i$ at some time $t$
$d$ is the damping_factor
$M(p_i)$ is the set of vertices that have an outbound edge towards $p_i$
$PR(p_j; t - 1)$ is the PageRank of vertex $p_j$ at some time $t - 1$
$D(p_j)$ is the degree of vertex $p_j$, i.e. the number of outbound edges of vertex $p_j$
$S$ is the set of sinks, i.e. the set of vertices with no outbound edges, i.e. where $D(p_j)$ is 0
This formula is equivalent to the following algorithm:
procedure graph_pagerank(G, damping_factor, epsilon)
    N = number of vertices in G
    for all V in vertices of G
        oldrank of V = 0
        pagerank of V = 1 / N
    end for
    while |pagerank of V - oldrank of V| of any V in vertices of G > epsilon
        for all V in vertices of G
            oldrank of V = pagerank of V
        end for
        sink_rank = 0
        for all V in vertices of G that have no outbound edges
            sink_rank = sink_rank + (damping_factor * (oldrank of V / N))
        end for
        for all V in vertices of G
            pagerank of V = sink_rank + ((1 - damping_factor) / N)
            for all I in vertices of G that have an edge from I to V
                pagerank of V = pagerank of V + ((damping_factor * oldrank of I) / number of outbound edges from I)
            end for
        end for
    end while
end procedure
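A fairly direct C translation of this pseudocode is sketched below. It assumes the illustrative struct graph above plus two small helpers you would write against your own representation (out_degree and has_edge are assumptions, not part of graph.h), and for brevity it omits the ignore-list checks your real version needs wherever vertices, sinks, and edges are counted:

#include <math.h>
#include <stdbool.h>
#include <stddef.h>

// Hypothetical helpers:
//   size_t out_degree(struct graph *g, size_t v);             // outbound edge count
//   bool   has_edge(struct graph *g, size_t from, size_t to); // edge existence test

static double max_delta(struct graph *g) {
    double max = 0.0;
    for (size_t i = 0; i < g->nV; i++) {
        double d = fabs(g->vertices[i].pagerank - g->vertices[i].oldrank);
        if (d > max) max = d;
    }
    return max;
}

void graph_pagerank(struct graph *g, double damping, double epsilon,
                    struct list *ignore_list) {
    double n = (double)g->nV;  // real version: count non-ignored vertices only

    for (size_t i = 0; i < g->nV; i++) {
        g->vertices[i].oldrank = 0.0;
        g->vertices[i].pagerank = 1.0 / n;
    }

    while (max_delta(g) > epsilon) {
        for (size_t i = 0; i < g->nV; i++)
            g->vertices[i].oldrank = g->vertices[i].pagerank;

        double sink_rank = 0.0;
        for (size_t i = 0; i < g->nV; i++)
            if (out_degree(g, i) == 0)  // sinks spread their rank over all pages
                sink_rank += damping * (g->vertices[i].oldrank / n);

        for (size_t v = 0; v < g->nV; v++) {
            g->vertices[v].pagerank = sink_rank + (1.0 - damping) / n;
            for (size_t i = 0; i < g->nV; i++)
                if (has_edge(g, i, v))
                    g->vertices[v].pagerank +=
                        damping * g->vertices[i].oldrank / out_degree(g, i);
        }
    }
}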
In order to test your PageRank functions, you should modify crawler.c to #include "pagerank.h", and change the last part of the main function to
something like:
...
graph_show(network, stdout, ignore_list);
graph_pagerank(network, damping, epsilon, ignore_list);
graph_show_pagerank(network, stdout, ignore_list);
list_destroy(ignore_list);
graph_destroy(network);
where you choose appropriate values for damping and epsilon.
Again, it is noted that the changes you make to crawler.c are purely for you to test whether your PageRank functions are working. We will use a different
crawler.c for testing your PageRank functions.
Sample Output
Here we're using a modified crawler.c that calls graph_pagerank and then graph_show_pagerank. Damping has been set to 0.85 and epsilon to
0.00001. For the micro-web, and without ignoring any pages, the output should be:
prompt$ ./crawler http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter a page to ignore or type 'done': done
http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html: 0.412
http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html: 0.196
http://www.cse.unsw.edu.au/~cs9024/micro-web/Y.html: 0.196
http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html: 0.196
prompt$
Now let's add index.html to the ignore list:
prompt$ ./crawler http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter a page to ignore or type 'done': http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter another page to ignore or type 'done': done
http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html: 0.333
http://www.cse.unsw.edu.au/~cs9024/micro-web/Y.html: 0.333
http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html: 0.333
prompt$
X.html, Y.html and Z.html have no connections anymore and as such have the same ranks. Note that the sum is still (approximately) equal to 1, and N, the
number of vertices, is equal to 3 in this case, since there were a total of 4 nodes originally, and 1 of the nodes has been ignored.
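As a sanity check, with $d = 0.85$ and all three remaining pages being sinks, the symmetric fixed point $PR = \frac{1}{3}$ satisfies the update formula:

$$PR = \frac{1 - d}{N} + d \sum_{p_j \in S} \frac{PR(p_j)}{N} = \frac{0.15}{3} + 0.85 \cdot \frac{3 \cdot \frac{1}{3}}{3} = 0.05 + 0.2833\ldots \approx 0.333$$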
Testing
We have created a script to automatically test your PageRank functions. It expects to find list.c and graph.c in the current working directory. Limited test
cases are provided, so you should always do your own, more thorough, testing.
prompt$ 9024 dryrun assn_rankings
Subset 4 - Degrees of Separation (Shortest Path)
In graph.c, implement the two functions prototyped in dijkstra.h: graph_shortest_path and graph_show_path.
First, graph_shortest_path should calculate the shortest path between a source vertex and all other vertices.
graph_shortest_path should use Dijkstra's algorithm to do so.
Note that an ignore list is also passed to graph_shortest_path. Similar to above, you will need to ensure these URLs are treated as non-existent. For
example if there was a path A->B->C, but B is ignored, then there is no path from A to C.
Unlike a regular implementation of Dijkstra's algorithm, your code should minimise the number of edges in the path (not minimise the total weight of the path -
consider each edge's weight to be 1).
Second, graph_show_path should print the path from the previously given source vertex to a given destination vertex. With the ignore list, there can be no
path between two vertices. In this case, output nothing.
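With unit weights, Dijkstra's algorithm reduces to a BFS in which the first discovery of a vertex is already a minimum-edge path. A sketch, assuming the illustrative struct graph above, the hypothetical list_contains helper, and a hypothetical find_vertex label lookup (the real prototypes are in dijkstra.h):

#include <stdbool.h>
#include <stddef.h>
#include <stdlib.h>

void graph_shortest_path(struct graph *g, char *source,
                         struct list *ignore_list) {
    size_t *queue = malloc(g->nV * sizeof(size_t));  // array-backed BFS queue
    bool *visited = calloc(g->nV, sizeof(bool));
    size_t *pred = malloc(g->nV * sizeof(size_t));   // predecessor of each vertex

    size_t head = 0, tail = 0;
    size_t s = find_vertex(g, source);  // hypothetical: label -> index
    visited[s] = true;
    queue[tail++] = s;

    while (head < tail) {
        size_t v = queue[head++];
        for (struct edge *e = g->vertices[v].edges; e != NULL; e = e->next) {
            size_t w = e->to;
            if (visited[w])
                continue;
            if (ignore_list && list_contains(ignore_list, g->vertices[w].label))
                continue;  // ignored pages are treated as non-existent
            visited[w] = true;
            pred[w] = v;  // first discovery = fewest edges from the source
            queue[tail++] = w;
        }
    }

    free(queue);
    free(visited);
    // Keep pred (and s) somewhere graph_show_path can reach, e.g. inside
    // struct graph, so it can walk backwards from the destination vertex.
}

Provided your adjacency lists preserve insertion (crawl) order, visiting neighbours in list order makes the tie-breaking match the crawler's BFS, as required below.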
You may add more utility functions to graph.c.
You may (and most likely will need to) extend your struct definitions in graph.c.
You must not modify the file dijkstra.h.
You must not modify the file pagerank.h.
You must not modify the file graph.h.
You must not modify the function prototypes for graph_shortest_path and graph_show_path.
In order to test your Dijkstra functions, you should modify crawler.c to #include "dijkstra.h", and change the last part of the main function to
something like:
...
graph_show(network, stdout, ignore_list);
graph_shortest_path(network, argv[1], ignore_list);
char destination[BUFSIZ];
printf("destination: ");
scanf("%s", destination);
graph_show_path(network, stdout, destination, ignore_list);
list_destroy(ignore_list);
graph_destroy(network);
The changes you make to crawler.c are purely for you to test whether your Dijkstra functions are working. We will use a different crawler.c for testing your
Dijkstra functions.
Sample Output
Using a modified crawler.c that accepts a source page as a command line argument (from which to calculate graph_shortest_path) and reads a destination
page (for which to output graph_show_path), for the micro-web, and without ignoring any pages, the output when tracing a path from X.html to Z.html should be:
prompt$ ./crawler http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html http://www.cse.unsw.edu.au/~cs9024/micro-web/
destination: http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html
Enter a page to ignore or type 'done': done
http://www.cse.unsw.edu.au/~cs9024/micro-web/X.html
-> http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
-> http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html
prompt$
Now let's add index.html to the ignore list:
prompt$ ./crawler http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html http://www.cse.unsw.edu.au/~cs9024/micro-web/
destination: http://www.cse.unsw.edu.au/~cs9024/micro-web/Z.html
Enter a page to ignore or type 'done': http://www.cse.unsw.edu.au/~cs9024/micro-web/index.html
Enter another page to ignore or type 'done': done
prompt$
Since index.html has been ignored, the path cannot be completed and nothing is returned. Your algorithm should iterate vertices/pages in the same order as the
crawler. This ensures that when your algorithm finds the shortest path, it will return the first path it would encounter from the BFS in the crawler. If your result
does not follow this order, you will be marked as incorrect, even if your path is valid.
Testing
We have created a script to automatically test your shortest path algorithm. It expects to find list.c and graph.c in the current working directory. Limited test
cases are provided, so you should always do your own, more thorough, testing.
prompt$ 9024 dryrun assn_path
Assessment
Due Date
Wednesday, 17 April, 11:59:59.
Late Penalty:
The UNSW standard late penalty for assessment is 5% per day for 5 days - this is implemented hourly for this assignment.
Each hour your assignment is submitted late reduces its mark by 0.2%.
For example, if an assignment worth 60% was submitted 10 hours late, it would be awarded 58.8%.
Beware - submissions more than 5 days late will not be accepted and will receive zero marks. This again is the UNSW standard assessment policy.
Submission
You should submit your list.c and graph.c files using the following give command:
prompt$ give cs9024 assn list.c graph.c
Alternatively, you can select the option to "Make Submission" at the top of this page to submit directly through WebCMS3.
Important notes:
Make sure you spell all filenames correctly.
You can run give multiple times. Only your last submission will be marked.
Ensure both files are submitted together. If you separate them across multiple submissions, each submission will replace the previous one.
Whether you submit through the command line or WebCMS3, it is your responsibility to ensure it reports a successful submission. Failure to submit correctly will not be considered as an excuse.
You cannot obtain marks by e-mailing your code to tutors or lecturers.
Assessment Scheme
This assignment will contribute 12 marks to your final COMP9024 mark.
11 marks will come from automated testing, and 1 mark will come from manual inspection of your code.
The specific breakdown of marks is as follows:
Description Marks
List ADT 3
Graph ADT 3
PageRank 2
Shortest Path 2
Memory Management 1
Code Quality 1
Total 12
Important:
Any submission that does not allow us to follow the aforementioned marking procedure "normally" (e.g., missing files, compile or run-time errors) may
result in delays in marking your submission. Depending on the severity of the errors/problems, we may ask you to resubmit (with max late penalty) or
assess your written code instead (e.g., for some "effort" marks only).
Ensure your submitted code compiles on a CSE machine using the standard options -Wall -Werror.
Memory management will be assessed using valgrind. You may refer to the Week 4 Practical for guidance on how you can compile your code and run it
through valgrind. Note, this will require you to write some sort of "driver" or "test" program for your ADT.
Code quality will be assessed on:
Readability - your code is generally easy to understand, follows typical spacing and indentation, and uses a consistent style.
Documentation - your code is documented in places where it is harder to understand.
While you are not required to follow it, you may refer to the CSE C Coding Style Guide.
Collection
Once marking is complete you can collect your submission using the following command:
prompt$ 9024 classrun -collect assn
You can also view your marks using the following command:
prompt$ 9024 classrun -sturec
You can also collect your submission directly through WebCMS3 from the "Collect Submission" tab at the top of this page.
Plagiarism
Group submissions will not be allowed. Your programs must be entirely your own work. Plagiarism detection software will be used to compare all submissions
pairwise (including submissions for similar assessments in previous years, if applicable) and serious penalties will be applied, including an entry on UNSW's
plagiarism register.
Do not copy ideas or code from others
Do not use a publicly accessible repository or allow anyone to see your code
Please refer to the on-line sources to help you understand what plagiarism is and how it is dealt with at UNSW:
Plagiarism and Academic Integrity
UNSW Plagiarism Policy Statement
UNSW Plagiarism Procedure
Copyright
Reproducing, publishing, posting, distributing or translating this assignment is an infringement of copyright and will be referred to UNSW Student Conduct and
Integrity for action.
