HANOI UNIVERSITY OF SCIENCE AND TECHNOLOGY
MASTER THESIS
Learning LBSN data using graph neural network
HOANG THANH DAT
School of Information and Communication Technology
Supervisor: Assoc. Prof. Huynh Quyet Thang
Supervisor’s signature
School:
Information and Communication Technology
May 26, 2022
SOCIALIST REPUBLIC OF VIETNAM
Independence – Freedom – Happiness
CONFIRMATION OF MASTER THESIS REVISIONS
Full name of the thesis author: Hoàng Thành Đạt
Thesis title: Learning LBSN data representations using graph neural networks
Major: Data Science and Artificial Intelligence
Student ID: 20202714M
The author, the scientific supervisor, and the Thesis Examination Board confirm that the author has revised and supplemented the thesis according to the minutes of the Board meeting on 28/04/2022, with the following content:
1) Changed the abbreviation Asso. Prof to Assoc. Prof.
2) Added a table of abbreviations before the Introduction section.
3) Fixed several typographical errors, especially in the description of Hypergraph Convolution (section 2.2.5), related to mathematical notation.
4) Added an illustrative example and a description of node degree and hyperedge degree in section 2.2.5.
5) Removed section 3.1 Introduction, which duplicated content from earlier sections, and replaced it with a short paragraph connecting chapter 2 to chapter 3.
6) Described the dataset in section 4.1.1 in more detail; added statistics on the numbers of hypernodes and hyperedges in the graph, and explained the value 168.
7) Added section 4.1.2 on Implementation, describing the source code of the proposed method, the baselines, and the configuration of the experimental machine.
8)
May 23, 2022
Supervisor
Thesis author
CHAIR OF THE EXAMINATION BOARD
SĐH.QT9.BM11
First issued on 11/11/2014
Graduation Thesis Assignment
Name: Hoang Thanh Dat
Phone: +84343407959
Email: ;
Class: 20BKHDL-E
Affiliation: Hanoi University of Science and Technology
I, Hoang Thanh Dat, hereby warrant that the work and presentation in this thesis were performed by myself under the supervision of Assoc. Prof. Huynh Quyet Thang. All the results presented in this thesis are truthful and are not copied from any other works. All references in this thesis, including images, tables, figures, and quotes, are clearly and fully documented in the bibliography. I take full responsibility for any copied content that violates school regulations.
Student
Signature and Name
Acknowledgement
I would like to express my gratitude to my primary supervisor, Assoc. Prof. Huynh Quyet Thang, who not only guided me throughout this project but also encouraged me during my five years at university. I would also like to show my appreciation to Mr. Huynh Thanh Trung, who read my numerous revisions and helped make sense of the confusion. I also want to extend my special thanks to Dr. Nguyen Quoc Viet Hung, who inspired me and helped me in my research career.

I would like to thank all the lecturers in the School of Information and Communication Technology, who provided valuable knowledge and experience during the Master program. I would also like to thank my friends and family, who supported me and offered deep insight into the study, especially Mr. Tong Van Vinh and Mr. Pham Minh Tam, who provided the experimental machines and helped expand my analysis for this work.
Abstract
Location-based social networks (LBSNs) such as Facebook and Instagram have emerged recently and attracted millions of people [1], allowing users to share their real-time experiences via checkins. LBSN data has become a primary source for various applications, from studying human mobility to social network analysis [2][3]. Two essential tasks in LBSNs, friendship prediction and POI recommendation, have been widely researched. While friendship prediction aims to suggest social relationships that will be formed in the future, POI recommendation predicts the location a user will visit at a given time. The two tasks are correlated: using mobility data can greatly enhance friendship prediction performance [4], and vice versa.

Traditional approaches often require expert domain knowledge, designing a set of hand-crafted features from user mobility data (e.g. co-location rates [5][3]) or user friendships (e.g. Katz index [3][6]) and combining those features for downstream tasks. These approaches demand huge human effort and domain expertise, yet lack generalizability to different applications [7]. Recent techniques capture the joint interactions between social relationships and user mobility by applying graph embedding techniques [8][9][10]. Graph embedding techniques embed the nodes into low-dimensional embedding spaces that can be transferred to downstream tasks, but they can only learn information from pairwise relationships: they cannot handle the complex characteristics of a checkin, and instead divide a checkin into classical pairwise edges, resulting in a loss of information. We argue that the LBSN graph is heterogeneous and indecomposable, so traditional techniques on classical graphs cannot capture the deep semantics in LBSN data.
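The indecomposability point can be seen in a tiny sketch (illustrative only, not the thesis code): a single checkin ties a user, a time stamp, a POI and a category together at once, and flattening it into pairwise edges multiplies the edge count while discarding which nodes co-occurred in the same checkin.

```python
# Illustrative sketch: a checkin relates four nodes simultaneously.
# Decomposing it into classical pairwise edges yields C(4,2) = 6 edges,
# and two different checkins can produce overlapping pairwise edge sets,
# so the n-wise semantics of the original checkin is lost.
from itertools import combinations

checkin = ("user_42", "t_2012_08_15_18", "poi_07", "cat_coffee")

# Hyperedge view: the checkin is kept as one indivisible unit.
hyperedge = frozenset(checkin)

# Classical pairwise decomposition of the same checkin.
pairwise_edges = set(combinations(sorted(checkin), 2))

print(len(hyperedge))       # 4
print(len(pairwise_edges))  # 6
```

All node identifiers above are hypothetical examples, not drawn from the thesis datasets.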
Recently, Graph Neural Networks (GNNs) have attracted wide attention [11][12] due to their capability of capturing the complex structural context in a graph. Traditional graph neural networks, however, can only learn pairwise relationships. Therefore, hypergraph convolution [13] was introduced to model the hypergraph and learn the n-wise proximity from hyperedges.
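As a hedged illustration of the layer referenced above, the following NumPy sketch implements the standard hypergraph convolution formulation X' = σ(Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2} X Θ), where H is the node-hyperedge incidence matrix, W holds hyperedge weights, and Dv, De are node and hyperedge degree matrices; the function name, toy shapes, and ReLU choice are assumptions for exposition, not the implementation described in chapter 3.

```python
# Minimal sketch of one hypergraph convolution layer (standard formulation).
# H: (n_nodes, n_edges) incidence matrix; X: node features; Theta: weights.
import numpy as np

def hypergraph_conv(H, X, Theta, w=None):
    n, m = H.shape
    w = np.ones(m) if w is None else w        # hyperedge weights (diag of W)
    Dv = (H * w).sum(axis=1)                  # weighted node degrees
    De = H.sum(axis=0)                        # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    W = np.diag(w)
    # Normalized hypergraph propagation operator.
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)     # ReLU nonlinearity

# Toy example: 4 nodes, 2 hyperedges (e.g. one friendship, one checkin).
H = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], dtype=float)
X = np.random.randn(4, 8)
Theta = np.random.randn(8, 8)
out = hypergraph_conv(H, X, Theta)
print(out.shape)  # (4, 8)
```

Stacking several such layers propagates information along hyperedges, which is how the multi-node context of checkins can be captured without pairwise decomposition.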
In this work, we propose HC-LBSN, a heterogeneous hypergraph convolution for LBSN tasks. From the LBSN data, our method first constructs the LBSN heterogeneous hypergraph, which contains four types of nodes (user, time stamp, POI and category) and two types of hyperedges (friendship and checkin). We then apply several hypergraph convolution layers to capture the complex structural context. The embedding of every node is learned in a unified vector space; however, we do not directly compare the similarity of nodes in the encoding space for
downstream tasks, but instead stack decoding layers that transform encoded node vectors into comparable vector spaces. We observe that the two essential downstream tasks can be transformed into one problem: scoring hyperedges (see section 3.5 for more details). Therefore, we apply a common method for both tasks. In particular, for each hyperedge candidate, a hyperedge embedding is generated and passed into an N-tuplewise similarity function to measure its existence.
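The scoring step described above can be sketched as follows; mean-pooling and a single linear scoring layer are illustrative assumptions here, since the exact N-tuplewise similarity function is defined in section 3.6.

```python
# Hedged sketch of scoring a candidate hyperedge: pool the decoded
# embeddings of its member nodes into one hyperedge embedding, then map it
# to an existence probability. w_score/b_score are hypothetical parameters.
import numpy as np

rng = np.random.default_rng(0)
d = 16
w_score = rng.standard_normal(d)   # hypothetical scoring weights
b_score = 0.0

def score_hyperedge(node_embeddings):
    """node_embeddings: (k, d) array for the k nodes of a candidate hyperedge."""
    e = node_embeddings.mean(axis=0)        # permutation-invariant pooling
    logit = float(e @ w_score + b_score)
    return 1.0 / (1.0 + np.exp(-logit))     # sigmoid -> existence score

# A friendship candidate (2 users) and a checkin candidate (4 nodes) are
# scored by the same function, reflecting the unified hyperedge view.
friendship = rng.standard_normal((2, d))
checkin = rng.standard_normal((4, d))
print(0.0 < score_hyperedge(friendship) < 1.0)  # True
print(0.0 < score_hyperedge(checkin) < 1.0)     # True
```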
Extensive experiments illustrate the improvement of our model over baseline approaches, proving that our method can capture the deep semantics in LBSN data and deal with the heterogeneity and indecomposability of the LBSN hypergraph. The analysis of hyperparameter sensitivity suggests that future work should replace the manually tuned balance parameter with a mechanism that automatically adjusts it for various datasets, and should also apply the graph attention mechanism.
The results of this work can be applied to analyse and build features for a social network platform, such as suggesting friends and recommending tourist places. The proposed model can also be applied to learning on different datasets in other domains thanks to its generalizability and its lack of expert-knowledge requirements.
Student
Signature and Name
TABLE OF CONTENTS
CHAPTER 1. INTRODUCTION ............................................................. 1
1.1 Location-based social networks (LBSNs) .............................................. 1
1.2 Research history ................................................................................. 3
1.3 Research challenges ............................................................................ 4
1.4 Our proposed method .......................................................................... 6
1.5 Contributions and Thesis Outline ......................................................... 7
1.6 Selected Publications .......................................................................... 9
CHAPTER 2. BACKGROUND ............................................................. 11
2.1 Learning on LBSNs data ..................................................................... 11
2.2 Graph Embedding Techniques ............................................................. 11
2.2.1 Overview ................................................................................ 11
2.2.2 Deepwalk ................................................................................ 13
2.2.3 Graph Neural Networks ............................................................ 14
2.2.4 Heterogeneous graph learning ................................................... 16
2.2.5 Hypergraph and Hypergraph convolution ................................... 20
CHAPTER 3. MULTITASK LEARNING FOR LBSNs USING HYPERGRAPH CONVOLUTION .............................................................. 24
3.1 Framework Overview.......................................................................... 24
3.2 Notations and Definitions .................................................................... 26
3.3 LBSNs Hypergraph Construction ......................................................... 26
3.4 Hypergraph convolution ...................................................................... 28
3.5 Loss function ..................................................................................... 29
3.6 Hyperedge embedding function ........................................................... 32
3.7 Optimization ...................................................................................... 33
CHAPTER 4. EXPERIMENTS............................................................. 34
4.1 Setting ............................................................................................... 34
4.1.1 Datasets .................................................................................. 34
4.1.2 Implementation........................................................................ 35
4.1.3 Downstream tasks and metrics .................................................. 36
4.1.4 Baselines................................................................................. 37
4.1.5 Hyperparameter setting............................................................. 37
4.2 End-to-end comparison ....................................................................... 38
4.2.1 Friendship prediction................................................................ 38
4.2.2 POI recommendation................................................................ 39
4.3 The effectiveness of hyperedge embedding functions ............................. 42
4.4 Hyperparameter sensitivity .................................................................. 44
4.4.1 Checkin hyperedge weight ........................................................ 44
4.4.2 The number of hypergraph convolution layers ............................ 46
CHAPTER 5. CONCLUSION............................................................... 49
LIST OF FIGURES
1.1 Example of LBSNs . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2.1 Illustration of a general GNN inductive framework on a specific node (the red node)[12] . . . 15
2.2 Example of heterogeneous graph for bibliographic network[33] . . . 16
2.3 Example of a heterogeneous graph where nodes a and b should have similar embeddings; different node colors represent different node types . . . 19
2.4 The difference between a simple graph (a) and a hypergraph (b)[13] . . . 20
2.5 Example of node degrees on a simple graph (a) and a hypergraph (b) . . . 22
2.6 Example of node degrees on a weighted simple graph (a) and a weighted hypergraph (b) . . . 22
3.1 Illustration of the HC-LBSN framework . . . 25
4.1 Friendship prediction performance of HC-LBSN (blue line) and other techniques on the four experimental datasets SP, KL, JK and IST . . . 40
4.2 POI recommendation performance (Hit@3) of HC-LBSN compared to other techniques on the experimental datasets IST, SP, KL and JK . . . 41
4.3 POI recommendation performance (Hit@5) of HC-LBSN compared to other techniques on the experimental datasets IST, SP, KL and JK . . . 41
4.4 POI recommendation performance (Hit@10) of HC-LBSN compared to other techniques on the experimental datasets IST, SP, KL and JK . . . 42
4.5 Friendship prediction performance on the SP, KL, JK and IST datasets with various hyperedge embedding functions . . . 42
4.6 POI recommendation performance on the SP, KL, JK and IST datasets with various hyperedge embedding functions . . . 43
4.7 Friendship prediction performance of HC-LBSN when increasing the checkin hyperedge weight in four experimental datasets . . . 45
4.8 POI recommendation performance of HC-LBSN when increasing the checkin hyperedge weight in four experimental datasets . . . 45
4.9 Friendship prediction performance of HC-LBSN when increasing the number of hypergraph convolution layers in the SP, JK and KL datasets . . . 47
4.10 POI recommendation performance of HC-LBSN when increasing the number of hypergraph convolution layers in the SP, JK and KL datasets . . . 47
LIST OF TABLES
3.1 Notations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.1 Statistics of the datasets . . . . . . . . . . . . . . . . . . . . . . . 35
ACRONYMS
Notation Description
GCN      Graph Convolutional Network
GNN      Graph Neural Network
LBSN     Location-based social network
MLP      Multilayer Perceptron
POI      Point of Interest
CHAPTER 1. INTRODUCTION
1.1 Location-based social networks (LBSNs)
A social network is a framework in which users can interact with others in many forms, such as friendship, common interests, and shared knowledge. Generally, a social networking service builds on and reflects the real-life social networks among people through online platforms such as websites, providing ways for users to share ideas, activities, events, and interests over the Internet[14].
With the development of location-acquisition technology (e.g. GPS and WiFi), people can attach a location to existing online social networks. For instance, people can share photos together with their current location, called checkins, on platforms such as Facebook and Twitter. They can also interact with others, for example by commenting on a hotel page in a social network and browsing other people's reviews of that hotel. By adding location to social networks, the gap between the social network and reality becomes smaller, and such networks are called location-based social networks (LBSNs). LBSNs such as Foursquare, Gowalla, Facebook Local and Yelp have emerged recently and attracted millions of users [1]. In LBSNs, users can share their real-time experiences via checkins, each of which includes four essential pieces of information: a user, a specific timestamp, an activity (e.g. swimming, trekking) and a POI (point of interest) such as a swimming pool or a supermarket.
Since LBSN data includes objects and various relationships between them, researchers often model it as a graph. Figure 1.1 illustrates an example of an LBSN graph containing four kinds of nodes: user, time, POI and semantic activity. In LBSNs, users, denoted by u, play a central role. Users can establish relations with each other; such relations between users are called friendships. In Figure 1.1, friendship edges are represented by blue dashed lines, for example, the relations between users (u1, u2) and (u3, u4). Users can also form relations with other node types such as locations and activities. A user's activity in an LBSN is called a checkin, formed by a quadruple of four crucial pieces of information (user, time stamp, POI, semantic), denoted by (u, t, p, s). A POI p represents the tagged location of a checkin, while s indicates the activity of the user, such as swimming or shopping. A time stamp t is the time when the checkin occurs. At a specific POI, different semantic activities can occur, such as shopping, eating or hanging out at a shopping mall.
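As a concrete illustration, a checkin quadruple (u, t, p, s) can be modeled as a small record. The class and field names below are hypothetical, not taken from any particular dataset schema:

```python
from dataclasses import dataclass

# Hypothetical minimal record for one checkin (u, t, p, s);
# field names are illustrative only.
@dataclass(frozen=True)
class Checkin:
    user: str       # u: the user who checks in
    timestamp: int  # t: when the checkin occurs (e.g. a Unix epoch)
    poi: str        # p: the tagged point of interest
    semantic: str   # s: the activity, e.g. "shopping" or "eating"

# The same POI can host different semantic activities:
mall_visits = [
    Checkin("u1", 1650000000, "shopping_mall_1", "shopping"),
    Checkin("u1", 1650003600, "shopping_mall_1", "eating"),
]
assert {c.semantic for c in mall_visits} == {"shopping", "eating"}
```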
LBSN data contains rich socio-spatial properties of user activities. LBSNs offer
Figure 1.1: Example of LBSNs
many new research challenges and are a primary data source for studying human mobility and social network analysis[2][3]. Two typical applications of LBSN data have been widely investigated: friendship prediction and POI recommendation. The former suggests social friendships that are likely to be established in the future. For example, people who enjoy the same activity at the same place within overlapping periods may meet and become friends, due to their common presence and hobby. POI recommendation aims at predicting the place people will visit at a given time. For instance, people in a community have a high chance of collaborating in the same activity (e.g. students attending a class, gym members working out at a gym).
Though LBSN data has been studied widely, effectively capturing the structure of the LBSN graph remains challenging due to its heterogeneity and the indecomposability of its hyperedges. In the next section, we describe the research history of LBSNs together with several baseline approaches, and highlight the research challenges that will be addressed by the newly proposed method.
1.2 Research history
Recently, location-based social networks such as Facebook Local, Yelp, Foursquare and Gowalla have attracted millions of users who share their daily experiences with others. These platforms contain a great pool of information that is mostly accessible to the public. To this end, many studies attempt to learn the underlying patterns of user behaviour in LBSNs[15].
Due to the intrinsic correlation between human mobility and social relationships, existing work has shown that considering such correlation can improve the performance of both friendship prediction [16][17] and location prediction[17]. Earlier techniques often require expert domain knowledge to design a set of hand-crafted features from user mobility data. For example, Wang et al.[5] defined the co-location rate based on their observations of the dataset and reality. Specifically, they determine a threshold depending on the number of shared communication activities: if two people have many shared activities, they will potentially become friends in the future. Yang et al. [7] characterise user mobility based on two criteria: the total time-independent travel distance and the probability of returning to particular locations. Song et al. [18] use a metric called mobility entropy to reflect users' daily mobility. Backstrom and Kleinberg [19] proposed a dispersion metric to estimate the tie strength between connected users and detect couples and romantic partners by their strong social bond pattern. These approaches, however, require significant human effort and domain knowledge, and lack generalizability to different applications.
Recent techniques leverage advances in graph representation learning[11][10][20] to embed nodes into low-dimensional embedding spaces that automatically capture user mobility and social context, based on the original graph topology and node attributes. The graph representation learning approach translates the complex structure of a graph into a latent space, in which nodes are assigned low-dimensional vectors so that the learnt latent space reflects the topology of the original graph. For example, if there is an edge between two nodes, these two nodes should be embedded close to each other in the embedding space. If two users are friends, they should be embedded close together in the latent space; a similar phenomenon holds for other relationships such as user-POI and user-activity. Hence, friendship prediction is often performed as a link prediction task by measuring the similarity between two user nodes[10][4], for example with cosine similarity or dot product. The POI recommendation task can also leverage these learnt embeddings to enhance prediction ability [4].
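As a sketch of this similarity-based link prediction, the snippet below scores a candidate friendship with cosine similarity over toy embeddings; the vectors are made up for illustration, and in practice they would come from a trained embedding model:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two node embeddings."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional user embeddings (illustrative values only).
emb = {
    "u1": np.array([0.9, 0.1, 0.0, 0.2]),
    "u2": np.array([0.8, 0.2, 0.1, 0.3]),  # close to u1 -> likely friends
    "u3": np.array([0.0, 0.9, 0.8, 0.1]),  # far from u1 -> unlikely friends
}
# The candidate pair with the higher score is the more likely friendship.
assert cosine_similarity(emb["u1"], emb["u2"]) > cosine_similarity(emb["u1"], emb["u3"])
```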
However, classical graph embedding techniques are unable to capture the complex characteristics of checkins. A checkin is formed by multiple objects of various types, such as (user, time, POI, category), and these objects have a strong joint relationship. To capture such a complex relationship, traditional methods divide a checkin into pairwise edges and apply a simple graph learning process to the pairwise relationships[4]. For example, in order to learn vector representations for an LBSN graph, the Deepwalk algorithm decomposes a checkin into 6 classical edges: (user, time), (user, POI), (user, category), (time, POI), (time, category) and (POI, category)[4]. This transformation is not invertible, so the pairwise relationships cannot fully describe a checkin, which leads to degraded performance. Moreover, the automatic decomposition algorithm generates less informative relationships such as (user, time) and (POI, time). Such a strong and indecomposable relationship like a checkin is called a hyperedge, where a hyperedge can contain more than two vertices.
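The pairwise decomposition described above can be sketched as follows; the helper name is hypothetical, and the comment notes why the mapping is lossy:

```python
from itertools import combinations

def decompose_checkin(checkin):
    """Split one (user, time, POI, semantic) checkin into its
    C(4, 2) = 6 pairwise edges, as Deepwalk-style methods do."""
    return list(combinations(checkin, 2))

edges = decompose_checkin(("u1", "t1", "p1", "s1"))
assert len(edges) == 6
# Note: the resulting edge set records only which nodes co-occur pairwise;
# once edges from many checkins are merged, the joint (u, t, p, s) grouping
# cannot in general be recovered, so the decomposition loses information.
```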
Given the indecomposable characteristic of checkin hyperedges, LBSN2Vec[4] was proposed to capture the complex properties and heterogeneity of the LBSN graph. LBSN2Vec follows a random walk based approach inspired by the Deepwalk algorithm: it introduces a random-walk-with-stay scheme to jointly sample friendships and checkins from the LBSN hypergraph, generating sequences of user nodes. At each user node, Yang et al.[4] randomly sample a set of checkins related to the current user and optimize the embedding of the user towards its checkins as well as towards the friends in its local context. Thus, LBSN2Vec can capture relationships between objects of different types. LBSN2Vec produces promising results compared to other baseline approaches due to its flexibility in learning the interactions between social friendship and user mobility. However, the strategy of LBSN2Vec is equivalent to the path-based embedding used in heterogeneous graph learning[21], which can only learn from the co-occurrence of nodes within a fixed window size. The random walk mechanism is also not invertible: the sentences generated by the random walk algorithm cannot reconstruct the original graph, so learning on generated sentences causes information loss and cannot fully exploit the original topology of the graph.
1.3 Research challenges
Despite extensive research on LBSNs, existing approaches still face many issues, as they cannot fully exploit the structure of the LBSN graph. For example, the hand-crafted feature approach cannot capture deep semantics and high-order proximity in the LBSN graph, as discussed above. Automatic feature learning methods consider both first-order and higher-order proximity but still suffer from information loss, and thus cannot fully capture the complex characteristics of LBSNs[4]. Learning an LBSN hypergraph is challenging due to the complexity of the graph and the requirements of the downstream tasks. We summarize the research challenges in LBSNs in three points:
• Heterogeneity: LBSN data includes many objects of various types, for instance users, locations, time stamps and semantic activities. Network embedding techniques on LBSNs have to account for the heterogeneity of the graph. Classical embedding algorithms like Deepwalk[10], Node2Vec[20] and GraphSAGE[12] can only handle homogeneous graphs, where nodes belong to a single type. When nodes have different types, different relationships arise in the graph, and each kind of connection has to be treated differently from the others. For example, in LBSNs two common connections are friendship edges and checkin edges. Since the number of checkins is often very large compared to the number of friendships, the influence of checkins on user embeddings has to differ from that of user-user relationships. For instance, two users are more likely to be friends if they both go to the gym club at the same time every week than if they have 7 mutual friends: because they often meet at the gym club, they have a high chance of chatting with each other, whereas having mutual friends implies no direct relation between them. Even when the numbers of mutual friendships and checkins are the same, their influence on a user differs; thus, when generating a vector representation for a user, the two kinds of connection must affect the user differently. How to weight the influence of the various connections is also a challenging task for heterogeneous graph embedding techniques.
• Indecomposability: As mentioned above, consider a checkin, which contains four pieces of information: a user, a time stamp, a POI and a semantic activity. These objects have a strong joint relationship, and any decomposition algorithm will cause a loss of information. Thus, classical embedding techniques[10][12], which can only capture pairwise relationships, cannot be applied to capture the complex property of a checkin.
• Multitask requirement: In LBSNs, two essential tasks have been widely researched: friendship prediction and POI recommendation. These tasks reflect the embedding quality of proposed methods; therefore, a newly proposed model has to be evaluated on both tasks. This results in multi-task training, and the problem is to balance the influence of social relationships and user mobility information on each other. For example, too much attention on user mobility causes poor results on friendship prediction and vice versa[4].
Understanding the drawbacks of existing LBSN learning methods, together with the remaining challenges, motivates us to propose a novel method that can handle the heterogeneity and indecomposability of an LBSN hypergraph. The proposed method can also be trained by multi-task learning in an end-to-end fashion. Details of our proposed model are introduced in the next section.
1.4 Our proposed method
The recent technique LBSN2Vec for learning LBSN hypergraphs can capture both first-order and high-order proximity[4]. However, LBSN2Vec[4] generates sequences of nodes following the random walk based approach and thus loses structural information about the graph. An improvement over random walk based approaches is the subgraph-based method[21], which captures the structural context between nodes. Compared with the meta-path structures produced by random walks, as in LBSN2Vec and Deepwalk, the structural context contains much more semantic information. An example is shown in section 2.2.4, where nodes a and b both connect to the same subgraph and therefore their embeddings should be similar.
Motivated by the use of structural context, we analyse the application of graph neural networks to LBSNs. Recently, graph neural networks have attracted wide attention as they successfully capture complex relationships between nodes and outperform other techniques in various downstream tasks such as node classification and link prediction[11][12][22]. Traditional graph neural networks such as GCN[11] and GAT[22], however, are designed to learn pairwise relationships in a homogeneous graph. Thus, hypergraph convolution[13] was proposed to model hypergraphs and learn n-wise proximity from hyperedges. We argue that hypergraph convolution can be effective in learning LBSN data since it can capture the complex characteristics of checkin hyperedges. However, it solves only one of the three main challenges mentioned above, and thus we improve the hypergraph convolution algorithm for our task on LBSN datasets by taking heterogeneity and multi-task learning into account.
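A minimal sketch of one hypergraph convolution layer, assuming the commonly used normalized formulation X' = σ(Dv^(-1/2) H W De^(-1) Hᵀ Dv^(-1/2) X Θ); exact normalization details vary between papers, so treat this as illustrative rather than the layer used in the thesis:

```python
import numpy as np

def hypergraph_conv(X, H, w, Theta):
    """One hypergraph convolution layer in the common normalized form.
    X:     (n, d)  node features
    H:     (n, m)  incidence matrix (H[v, e] = 1 if node v is in hyperedge e)
    w:     (m,)    hyperedge weights
    Theta: (d, d') learnable weight matrix
    """
    W = np.diag(w)
    dv = H @ w                      # weighted node degrees
    de = H.sum(axis=0)              # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    out = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt @ X @ Theta
    return np.maximum(out, 0.0)     # ReLU nonlinearity

# Toy hypergraph: 4 nodes, 2 hyperedges e1 = {0, 1, 2}, e2 = {2, 3}.
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
out = hypergraph_conv(np.eye(4), H, w=np.ones(2), Theta=np.eye(4))
assert out.shape == (4, 4)
```

Note how information propagates jointly among all members of a hyperedge, rather than only along pairwise edges.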
In this work, we propose HC-LBSN, a heterogeneous hypergraph convolution for
the LBSN task. The heterogeneous hypergraph contains four types of nodes (user, time stamp, location, category) with two types of hyperedges: friendship hyperedges and checkin hyperedges. Our method follows the encoder-decoder architecture, where the encoder uses several hypergraph convolution layers to learn node representations for the different node types in a unified vector space. By stacking several hypergraph convolution layers, the encoder can capture high-order proximity between nodes.
In LBSN2Vec, Yang et al. [4] argue that different node types should be embedded into different embedding spaces, since a unified space cannot reflect the similarity between nodes. For example, consider a famous person who has many friends. This person is also connected to a POI via a checkin, but none of her friends are linked to this POI. When the embeddings of users and POIs are learned in a single space, the embedding of the famous user is close both to the POI and to her friends, yet at the same time the embeddings of her friends should be far away from that POI, since there is no connection. This is a conflict. However, in our proposed method we do not directly compare the similarity of nodes in this encoding space for downstream tasks; instead, we use two decoders, one for each of the two essential tasks, friendship prediction and POI recommendation. In particular, since friendships and checkins are represented by friendship hyperedges and checkin hyperedges, respectively, predicting a missing hyperedge is equivalent to predicting a friendship or a checkin. Hence, we use a common method for both tasks. Specifically, our decoder first generates hyperedge embeddings and passes them to a non-linear function that computes a score for the existence of the hyperedge. A higher score reflects a higher probability that the hyperedge exists, and vice versa.
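To make the decoder idea concrete, the following NumPy sketch pools the member node embeddings of a hyperedge and maps them to an existence score. The mean pooling, the tanh/sigmoid choices, and the randomly initialized weights are illustrative assumptions here, not the exact trained model:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16  # embedding dimension

# Node embeddings as produced by a (hypothetical) hypergraph encoder.
node_emb = {name: rng.normal(size=d) for name in ["user", "time", "poi", "category"]}

# Illustrative decoder parameters: one hidden layer followed by a scalar score.
W1 = rng.normal(size=(d, d)) * 0.1
W2 = rng.normal(size=(d, 1)) * 0.1

def hyperedge_score(members):
    """Pool member node embeddings, then map them to an existence probability."""
    h = np.mean([node_emb[m] for m in members], axis=0)  # simple mean pooling
    h = np.tanh(h @ W1)                                   # non-linear transform
    return float(1.0 / (1.0 + np.exp(-(h @ W2))))         # sigmoid score in (0, 1)

# A checkin hyperedge connects user, time stamp, POI and category nodes.
score = hyperedge_score(["user", "time", "poi", "category"])
print(0.0 < score < 1.0)  # → True
```

Because the sigmoid keeps the output in (0, 1), the score is directly interpretable as the probability that the hyperedge exists.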
Our proposed method is trained with multi-task learning in an end-to-end fashion. The two essential LBSN tasks are optimized jointly, updating both the encoder and the two decoders. The generated node embeddings can thus be used for the downstream tasks of predicting newly forming social relationships and suggesting POIs to users. In order to balance the influence of each hyperedge type on the final node representations, we add weights to the hyperedges; adjusting them makes the embeddings more attentive to either social relationships or human mobility.
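A minimal sketch of such a weighted multi-task objective (the binary cross-entropy form, the example scores, and the weight value are illustrative assumptions, not the exact loss of our model):

```python
import numpy as np

def binary_ce(p, y):
    """Binary cross-entropy between predicted existence scores p and labels y."""
    eps = 1e-9
    return -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Illustrative decoder scores and ground-truth labels for each hyperedge type.
friend_scores,  friend_labels  = np.array([0.9, 0.2]), np.array([1.0, 0.0])
checkin_scores, checkin_labels = np.array([0.8, 0.3]), np.array([1.0, 0.0])

w_checkin = 0.5  # hyperedge weight: <1 favours social ties, >1 favours mobility

loss = (binary_ce(friend_scores, friend_labels).mean()
        + w_checkin * binary_ce(checkin_scores, checkin_labels).mean())
print(float(loss) > 0.0)  # → True
```

Raising `w_checkin` makes the optimizer pay more attention to reconstructing checkin hyperedges, i.e. to human mobility rather than social relationships.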
1.5 Contributions and Thesis Outline
With the proposed approach, which addresses the three main challenges and handles the drawbacks of traditional methods, we summarize the contributions of this thesis as follows:
• A novel approach that applies hypergraph convolution to the LBSN task, thus dealing with the indecomposability of the LBSN graph and capturing the high-order structure of user-checkin relationships.
• The model follows the encoder-decoder architecture, which handles the heterogeneity of node embeddings and enables multi-task learning for both friendship prediction and future checkin prediction.
• Extensive experiments illustrate the capability of our proposed model to capture deep semantic structures in the LBSN hypergraph. Our method outperforms baseline approaches on the two tasks: friendship prediction and POI recommendation.
The thesis is organized as follows.
Chapter 1 provides an introduction to our work, including the LBSN problem (section 1.1), baseline methods (section 1.2), and the remaining challenges (section 1.3). From these observations, we propose a novel model to deal with the challenges of the LBSN task (section 1.4).
In chapter 2, we go into more detail about the background knowledge, starting with a summary of learning on LBSN data, which briefly introduces the different approaches to such data (section 2.1). Then, we introduce the referenced embedding techniques in detail in section 2.2, approaching the problem from a top-down view. In this section, we first present an overview of graph embedding techniques with their three main approaches (section 2.2.1). Then, we describe several graph embedding methods that are used in the experiments and that influence the proposed method, including Deepwalk (section 2.2.2) and Graph Neural Networks (section 2.2.3). Next, section 2.2.4 approaches closer to our method: we present heterogeneous graph learning, a sub-field of graph embedding that handles different types of nodes and connections in a graph, and introduce popular methods for capturing the complex relationships in a heterogeneous graph. Classical heterogeneous graphs are still not suitable for LBSN data, so we describe a more expressive graph data structure called a hypergraph, which models LBSN data better. Section 2.2.5 provides the definition of a hypergraph and the most effective method to learn on it, named hypergraph convolution. Hypergraph convolution is a kind of GNN that can capture the complex characteristics of hyperedges such as checkins.
Chapter 3 describes our proposed method. In this chapter, we first give a short
introduction including the observations that motivate this work. Then, we present our model in detail, again from a top-down view. In particular, we first introduce the framework in section 3.1, which illustrates the model components and flows. We then describe each stage of our model, including hypergraph construction in section 3.3 and learning on the constructed graph with hypergraph convolution in section 3.4. The model is trained in an end-to-end fashion, and the loss function is defined in section 3.5. Finally, we show how to train our model in the optimization section (see section 3.7).
After describing our proposed method for learning downstream tasks in LBSNs, we perform experiments to demonstrate the capability of the model to capture the complex relationships in the LBSN hypergraph. Chapter 4 presents our experiments on common datasets for evaluating LBSNs. We first provide the experimental settings in section 4.1, where we introduce the popular datasets and explain how the model is evaluated with downstream tasks and metrics. We also provide the settings for the baseline approaches and the hyperparameter configuration of our model. In the following sections, extensive experiments are performed to show the quality of the generated embeddings. In section 4.2, we compare our method HC-LBSN with the baseline approaches for both friendship prediction and POI recommendation. We then analyse the influence of each model component; several component candidates, which are difficult to set, are evaluated in the hyperparameter sensitivity section (section 4.4). This section provides an inside view of the influence of different hyperparameters, including the checkin hyperedge weight and the number of hypergraph convolution layers.
Chapter 5 concludes our work and discusses future work.
1.6 Selected Publications
In this section, we list several publications related to this work and to the background on graph embedding.
• Pham Hai Van, Dat Hoang Thanh, and Philip Moore. "Hierarchical pooling in graph neural networks to enhance classification performance in large datasets." Sensors 21.18 (2021): 6070. (Accepted).
• Pham Minh Tam*, Hoang Thanh Dat*, Huynh Thanh Trung and Huynh Quyet Thang. "Social multi-role discovering with hypergraph embedding for Location-based Social Networks." In 14th Asian Conference on Intelligent Information and Database Systems (2022). (Submitted).
This chapter has introduced the overview of our work, providing the necessary knowledge for this thesis, from understanding the problem to the motivation for the proposed method. In the next chapter, we describe the compulsory background before approaching the proposed method, covering learning on LBSN data and graph embedding techniques.
CHAPTER 2. BACKGROUND
2.1 Learning on LBSN data
Recently, location-based social networks such as Facebook Local, Yelp, Foursquare, and Gowalla have attracted millions of users who share their daily experiences with others. These platforms contain a great pool of information that is mostly accessible to the public. To this end, much research attempts to learn the underlying patterns of user behaviour in LBSNs [15]. Earlier techniques often leverage hand-crafted features, such as daily routines [18], [23] and the dispersion metric [19], and apply heuristics to retrieve the needed insight about the users. For example, Wang et al. [5] observe that friends often share communication activities; they therefore define a threshold and deem two people potential friends if the number of shared activities between them exceeds the threshold. Yang et al. [7] characterise user mobility based on two criteria: the total time-independent travel distance and the probability of returning to particular locations. Song et al. [18] use a metric called mobility entropy to reflect users' daily mobility. Backstrom and Kleinberg [19] propose the dispersion metric to estimate the tie strength between connected users and to detect couples and romantic partners by their strong social-bond pattern. However, such feature engineering requires significant human effort and expert knowledge.
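As a toy sketch of this kind of threshold heuristic (the activity data and threshold value are invented for illustration; Wang et al. [5] work with real communication logs and tune the threshold on data):

```python
from itertools import combinations

# Hypothetical activity log: user -> set of (place, day) activities.
activities = {
    "alice": {("cafe", 1), ("gym", 2), ("park", 3)},
    "bob":   {("cafe", 1), ("gym", 2)},
    "carol": {("mall", 5)},
}

THRESHOLD = 2  # illustrative value

def predicted_friends(acts, threshold):
    """Return pairs whose number of shared activities meets the threshold."""
    pairs = []
    for u, v in combinations(sorted(acts), 2):
        if len(acts[u] & acts[v]) >= threshold:
            pairs.append((u, v))
    return pairs

print(predicted_friends(activities, THRESHOLD))  # → [('alice', 'bob')]
```

The heuristic is simple and interpretable, but every new insight requires designing a new feature and threshold by hand, which is exactly the effort that learned embeddings aim to remove.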
2.2 Graph Embedding Techniques
2.2.1 Overview
In this section, we present the learning process on classical graphs. Let G(V, E) be a directed graph, where V is the set of nodes and E is the set of edges, and let the matrix X ∈ R^{|V|×d} be the node feature matrix, with d the number of features. For example, in a social network, V denotes the set of users and the edges indicate the social friendships between them. Personal information about the users, such as age, gender, and hobbies, can be encoded into the node feature matrix.
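These definitions can be written down directly; the following toy sketch (users and feature values are invented) builds the adjacency matrix and the node feature matrix X ∈ R^{|V|×d}:

```python
import numpy as np

# A tiny directed social graph: nodes are users, edges are friendship links.
V = ["alice", "bob", "carol"]
E = [("alice", "bob"), ("bob", "carol"), ("carol", "alice")]

idx = {u: i for i, u in enumerate(V)}

# Adjacency matrix A in {0,1}^{|V| x |V|}.
A = np.zeros((len(V), len(V)), dtype=int)
for u, v in E:
    A[idx[u], idx[v]] = 1

# Node feature matrix X in R^{|V| x d}, here d = 2 (e.g. age, number of hobbies).
X = np.array([[25.0, 3.0],
              [31.0, 1.0],
              [28.0, 2.0]])

print(A.shape, X.shape)  # → (3, 3) (3, 2)
```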
Most existing graph embedding approaches focus on preserving pairwise relationships between nodes in a graph. Graph embedding techniques can be divided into three main approaches [24]:
• Matrix factorization based: These methods represent the connections in the
graph, either first-order proximity (e.g. the adjacency matrix) or higher-order proximity (e.g. the similarity of the structural context between two nodes), in the form of a matrix. The node embeddings are then learned by factorizing this matrix. Most of these methods exploit only the nodes and the relationships illustrated by the edges; they are unable to aggregate information from node features. Hence, these methods have not been widely researched in the last three years. Popular matrix factorization methods include GraRep [25], TADW [26], and HOPE [8].
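A minimal sketch of the factorization idea, using a plain truncated SVD of the adjacency matrix of a toy graph (methods such as HOPE factorize more elaborate higher-order proximity matrices, but the mechanics are the same):

```python
import numpy as np

# Proximity matrix of a tiny undirected graph; here we use the adjacency
# matrix itself, i.e. first-order proximity.
S = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Rank-k factorization via SVD: S ≈ U_k diag(s_k) V_k^T.
# A common choice of embedding is U_k scaled by the square-root singular values.
k = 2
U, s, Vt = np.linalg.svd(S)
emb = U[:, :k] * np.sqrt(s[:k])   # one k-dimensional embedding per node

print(emb.shape)  # → (4, 2)
```

Nodes with similar rows in the proximity matrix receive similar embeddings; node features play no role, which is the limitation noted above.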
• Random walk based: Random walk based approaches can capture more complex relationships than matrix factorization approaches. By applying a random walk mechanism, an input graph is transformed into a list of sentences, where each sentence is a sequence of nodes. This idea is inherited from word embedding on documents: nodes are treated as words, and the representation of a node is learned by predicting its local context (its neighbouring nodes in the walk). Deepwalk [10] is the first method that applies the random walk mechanism on a graph to capture its structure. The random walk used in Deepwalk is completely uniform and does not consider edge weights between nodes. Node2Vec [20] therefore improves the random walk mechanism by exploiting edge weight information. The biased random walk proposed in Node2Vec captures more flexible structure by adjusting the hyperparameters p and q, which tune the balance between the two traversal strategies BFS and DFS. It should be noted that random walk based methods can only be applied to non-attributed graphs.
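The uniform walk used by Deepwalk can be sketched in a few lines (the toy graph is invented; Node2Vec would replace the uniform `rng.choice` with a biased transition controlled by p and q):

```python
import random

# Toy undirected graph as an adjacency list.
graph = {
    "a": ["b", "c"],
    "b": ["a", "c"],
    "c": ["a", "b", "d"],
    "d": ["c"],
}

def random_walk(g, start, length, rng):
    """Uniform random walk as used by Deepwalk (edge weights are ignored)."""
    walk = [start]
    while len(walk) < length:
        walk.append(rng.choice(g[walk[-1]]))  # uniform choice over neighbours
    return walk

rng = random.Random(0)
# Each walk becomes a "sentence" of node tokens for Skip-gram training.
sentences = [random_walk(graph, n, 5, rng) for n in graph]
print(len(sentences), all(len(s) == 5 for s in sentences))  # → 4 True
```

The resulting sentences are then fed to a Skip-gram model exactly as if they were text, which is the subject of section 2.2.2.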
• Deep learning based: Due to the success of deep learning models on various tasks, deep learning has also been applied to node representation learning. Popular methods include SDNE [27] and VAE [28], and the most famous and most widely used recently is the Graph Neural Network (GNN). Variants based on GNNs, such as GCN [11], GraphSAGE [12], and GAT [22], capture the structure of a graph with various aggregation functions. More details on Graph Neural Networks are presented in section 2.2.3. This approach outperforms traditional methods on many downstream tasks [12][22] due to its capability to capture and aggregate information from both node features and relationships in a graph. In particular, GNNs learn from both the node features and the edges of a graph. Moreover, they can also exploit edge feature information and handle heterogeneous graphs [21].
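The message-passing idea behind these GNN variants can be sketched as a single layer with mean aggregation (a GraphSAGE-style sketch; the random weights stand in for trained parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

# One message-passing layer on a 3-node graph with mean aggregation.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # adjacency matrix
X = rng.normal(size=(3, 4))              # node features, d = 4
W = rng.normal(size=(4, 4)) * 0.1        # layer weights

deg = A.sum(axis=1, keepdims=True)
H_neigh = (A @ X) / deg                  # mean of each node's neighbour features
H = np.maximum(0.0, (X + H_neigh) @ W)   # combine with self features, then ReLU

print(H.shape)  # → (3, 4)
```

Stacking several such layers lets information flow over multi-hop paths, which is exactly the mechanism our encoder generalizes from edges to hyperedges.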
In the next sections, we present some graph embedding techniques, namely Deepwalk and GNNs. Deepwalk is a popular random walk based method and is also used in our experiment section (see section 4.2), so we introduce it first. We then introduce the general process of GNN algorithms following the message-passing mechanism, due to their success in learning network embeddings.
2.2.2 Deepwalk
Though GNN approaches often perform better than random walk based methods on various downstream tasks such as node classification and node clustering [12][22], the Deepwalk algorithm is still widely used due to its low computation cost compared to GNNs. In this work, Deepwalk also serves as a comparison baseline; thus, in this section we introduce the Deepwalk algorithm in detail, in order to show later (section 4.2) that it cannot capture the complex relationships within a hyperedge.
Deepwalk inherits the idea of the Skip-gram model [29][30], which enables learning a node representation from its context. The Deepwalk algorithm transforms a graph into sentences using the random walk mechanism, and the learning process is then applied to the generated sentences. Let s = {v_1, v_2, ..., v_L} be a sentence generated by the random walk algorithm, where L is the walk length. Based on Skip-gram, Deepwalk learns the representation of node v_i by predicting its local context (its neighbouring nodes). The objective function is:
\begin{equation}
\min_f \; -\log \Pr\big(\{v_{i-t}, \dots, v_{i+t}\} \setminus v_i \mid f(v_i)\big)
\tag{2.1}
\end{equation}
\begin{equation}
\Pr\big(\{v_{i-t}, \dots, v_{i+t}\} \setminus v_i \mid f(v_i)\big) = \prod_{j=i-t,\, j \neq i}^{i+t} \Pr\big(v_j \mid f(v_i)\big)
\tag{2.2}
\end{equation}
where {v_{i-t}, ..., v_{i+t}} \ v_i is the context of node v_i and t is the window size. After the learning process, nodes that share similar contexts in the generated sentences lie close together in the embedding space. The analogy with text: if the words x and y appear in the same contexts, then x and y are synonyms, so x and y can replace each other in a sentence. On a graph, having a similar structural context means having many common neighbours, so nodes with many common neighbours should be embedded close to each other in the embedding space. Conversely, if two nodes have very different contexts, their distance is