1 of 105

AI-as-a-Service on Blockchain

Steve Liao

2 of 105

Code finished 12 hours ago is already in use:

The FinTech world: all-time, all-place, all-provider. Highly competitive: a world belonging to those who change it with code.

No longer a world of "have the idea, but no way to realize it"

Culture matters most: learn by doing, play for real (Tech); cross domains (Fin)

3 of 105

Dr. Shih-Wei Liao

  • Dr. Liao's vocation is to educate and serve through innovation, products, and open source, and to change the world.
  • Dr. Liao (PhD, Stanford) worked at Stanford, Intel, and Google for 22 years, receiving Google's highest internal honor, the Founders' Award.
  • In 2013 Dr. Liao retired from Google and returned to Stanford to teach program analysis and optimization; he also teaches the latest Android systems and FinTech big-data courses at National Taiwan University.
  • The 天龍八部 projects (such as 臺大幫幫忙) on the Gcoin blockchain.
  • NTU Hackathon: the innovation mechanism that preceded 天龍八部, kept up to date.

4 of 105

General AI vs. Specific AI

  • Not just machine learning for a specific domain, but learning general knowledge
    • Just like Google: the AlphaGo and DeepMind examples
  • General AI still needs to be validated through some domain
    • General AI usually picks very hard domains, such as playing Go
  • General AI: research should move toward letting the machine work out methods by itself, rather than relying on humans to tell it what features to extract.
    • Specific AI vs. General AI: feature selection vs. genuine machine intelligence
    • Domain expert vs. computer science

5 of 105

AlphaGo

  • Policy Network
  • Value Network
  • Monte Carlo
  • Reinforcement learning

AlphaGo experience is General AI

  • It sheds light on what is intuition, what is intelligence, what is insight.
  • A stroke of genius (神來之筆)

6 of 105

AlphaGo of 2016 vs. Deep Blue of 1997

  • General Intelligence vs. Brute Force
    • AI vs. Hardware game
    • Intelligence vs. Alpha-beta search

AlphaGo experience is General AI

  • AlphaGo sheds light on what is intuition, what is intelligence, what is insight.
  • A stroke of genius (神來之筆)

7 of 105

Mother hen leading the chicks (母雞帶小雞)

  • General AI vs. Specific AI

8 of 105

Agenda

  • Specific AI case study:
    • Our deep learning for MRI-analytics-and-prediction
      • Domain knowledge and feature selection is key
  • Specific AI + General AI case study:
    • Our robo-advisor (理財機器人)

9 of 105

Our Robo-advisor

  • Our wealth-management algorithm has two main parts: OLMAR and an RNN.
    • The OLMAR algorithm requires domain knowledge to perform portfolio selection.
      • OLMAR turns financial knowledge into an algorithm.
    • Our RNN
    • The deep learning part is the RNN; the RNN is still a general method and uses no domain knowledge.

10 of 105

Our Robo-advisor

  • The main research direction is to use domain knowledge to preprocess the data (the OLMAR part),
  • then hand the processed data to the RNN (the deep learning part).
  • Generalization: as long as an algorithm can organize data from other scenarios into sequence form, the data can be processed with deep learning.

11 of 105

The robo-advisor's trends: Chatbot and deep wealth management

  • Chatbot: Virtual assistant
    • Key technologies:
      • Natural Language Processing
      • Self-Learning
  • Deep learning for wealth management

12 of 105

Chatbot


13 of 105

Every business should have an APP

14 of 105

Every business should have an APP

Chatbot

15 of 105

#1 Access the 1-billion-user platform

Over 1 billion users

Over 220 million MAU

67% said they expect to message with businesses in the next 2 years

53% said they are more likely to buy from a business they can message directly

16 of 105

#2 Everyone hates bothersome navigation

83% think it's hard to find what they want efficiently on a bank's website

Hi, I’d like to apply for a credit card

Dear Kevin, here is the application form for you.

Please fill in and return to us.

17 of 105

#3 Everyone needs a personalized service

Pros:

  • Connected with the bank service
  • Access account information in chat interface

Cons:

  • Lack of AI
  • Only support naïve commands (1, 2,…)

18 of 105

Chat Platforms

Facebook, Line, SMS…

Natural Language Understanding

Cooperate with

Bank API

Natural Language Generation

Our AI Technology

“How much can I spend on my credit card?”

Intent: “ASK QUOTA”

Quota: $200

Hi David, you have $200 to spend before 1st Dec. Happy to organize a credit limit.
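The pipeline on this slide (message, intent, bank API, reply) can be sketched minimally in Python. All names here (`detect_intent`, `bank_api`, the keyword table, the canned quota) are hypothetical illustrations, not the actual system:

```python
# Toy sketch of the chatbot flow: NLU -> bank API -> NLG.
# A real system would use a trained NLU model, not keyword matching.

INTENT_KEYWORDS = {
    "ASK_QUOTA": ["spend", "credit limit", "quota"],
    "APPLY_CARD": ["apply", "credit card"],
}

def detect_intent(message: str) -> str:
    """Toy keyword-based intent detection."""
    text = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return intent
    return "UNKNOWN"

def bank_api(intent: str, user: str) -> dict:
    """Stand-in for the bank API; returns canned data for the demo."""
    if intent == "ASK_QUOTA":
        return {"quota": 200, "due": "1st Dec"}
    return {}

def generate_reply(intent: str, user: str, data: dict) -> str:
    """Template-based natural language generation."""
    if intent == "ASK_QUOTA":
        return (f"Hi {user}, you have ${data['quota']} to spend "
                f"before {data['due']}.")
    return "Sorry, I didn't understand that."

msg = "How much can I spend on my credit card?"
intent = detect_intent(msg)
reply = generate_reply(intent, "David", bank_api(intent, "David"))
```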

19 of 105

Demo


20 of 105

Deep Learning for the Robo-advisor


21 of 105

Deep Wealth Management Outline

  • Overview
  • Problem Formulation
  • Related Work
  • Methodology
    • OLMAR
    • RNN
    • OLMAR+RNN (LSTM)


22 of 105

Deep Wealth Management Outline (cont.)

  • Experiments
    • Data Description
    • Platform
    • Result
  • More Applications
  • Future Work
  • Summary of deep wealth management


23 of 105

Price Prediction

  • Predicting price changes to improve financial investment has been a hot topic in recent years.
  • A bundle of methods claim to be efficient when used to predict the prices of stocks and other financial instruments.
  • Several algorithms are introduced below to help predict price trends.


24 of 105

OLMAR Introduction

  • We want to know whether these algorithms are really useful, and even if they are, whether there is any way to make them better.
  • OLMAR is a widely recommended algorithm, and it does perform well at helping people pick the right portfolio.


25 of 105

100% Return Rate Possible?

  • Several other algorithms also seem efficient.
  • Our work compares OLMAR with other algorithms and, furthermore, combines it with deep learning.
    • If it works, it can yield a 100% return per year (neglecting transaction fees)


26 of 105

Portfolio Selection & Online Learning

  • A portfolio is a combination of several financial investments.
  • As the saying goes, "don't put all your eggs in one basket."
  • Research on a portfolio's behaviour is therefore more attractive than research on a single stock.
  • Also, since the data arrive in sequential order, online learning fits the scenario well.


27 of 105

Problem Formulation (1/3)

  • Here's our parameter setting: on the t-th period, the assets' prices are represented by a close-price vector p(t) ∈ R^m.
  • Each element p(t,i) represents the close price of asset i.
  • The price changes are represented by a price relative vector x(t) ∈ R^m, with x(t,i) = p(t,i)/p(t−1,i).


28 of 105

Problem Formulation (2/3)

  • An investment on the t-th period is specified by a portfolio vector b(t) = (b(t,1), ..., b(t,m)), where b(t,i) represents the proportion of wealth invested in asset i.
  • We set b(1) = (1/m)·1, where m is the number of assets.
  • On the t-th period, a portfolio b(t) produces a period return s(t); that is, the wealth increases by a factor of s(t) = b(t)·x(t) = ∑_{i=1}^{m} b(t,i)·x(t,i).
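A minimal NumPy sketch of these definitions (price relatives, period returns, and the cumulative wealth they compound into); the helper names are ours, not from the paper:

```python
import numpy as np

# x[t, i] = p[t, i] / p[t-1, i]; s_t = b_t . x_t; S_T = prod_t s_t.

def price_relatives(prices: np.ndarray) -> np.ndarray:
    """Price relative matrix from a (T+1) x m close-price matrix."""
    return prices[1:] / prices[:-1]

def cumulative_wealth(b: np.ndarray, x: np.ndarray) -> float:
    """S_T = prod_t (b_t . x_t), starting from wealth 1."""
    return float(np.prod(np.sum(b * x, axis=1)))

prices = np.array([[10.0, 20.0],
                   [11.0, 19.0],
                   [12.1, 20.9]])
x = price_relatives(prices)      # shape (2, 2)
m = prices.shape[1]
b = np.full_like(x, 1.0 / m)     # uniform portfolio b = (1/m) * 1 each period
S = cumulative_wealth(b, x)      # compounds 1.025 * 1.1
```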


29 of 105

Problem Formulation (3/3)

  • GOAL: maximize the final cumulative wealth S(T) = ∏_{t=1}^{T} s(t), starting from initial wealth S(0) = 1.


30 of 105

Related Work (1/2)

  • Empirical results show that mean reversion, which assumes a poorly performing stock may perform well in subsequent periods, may better fit the markets.
  • Indeed, algorithms based on this assumption have good performance on specific datasets.


31 of 105

Related Work (2/2)

  • However, those algorithms[2][3] do not perform well on most datasets, since they rely only on the simple assumption that the predicted next price relative x̃(t+1) is inversely proportional to the last price relative x(t).
  • Two key points: a poor stock has a higher probability of performing well in subsequent periods, but at the same time we cannot simply assume that x(t+1) is inversely proportional to x(t).

[2]: Bin Li. Passive Aggressive Mean Reversion

[3]: Bin Li. Confidence Weighted Mean Reversion


32 of 105

Methodology-OLMAR (1/8)

  • On-Line Moving Average Reversion (OLMAR)[4] algorithm
    • Based on the conclusions above, the On-Line Moving Average Reversion (OLMAR) algorithm introduces several new concepts.
    • The essential idea is to exploit multi-period moving average (mean) reversion.

[4]: Bin Li. On-Line Portfolio Selection with Moving Average Reversion. 2012


33 of 105

Methodology-OLMAR (2/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • Rather than p̃(t+1) = p(t−1), OLMAR assumes the next price will revert to the moving average (MA) at the end of the t-th period
    • That is, p̃(t+1) = MA_t(w) = (1/w)·∑_{i=t−w+1}^{t} p(i), where MA_t(w) denotes the moving average with a w-day window


34 of 105

Methodology-OLMAR (3/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • We propose a price relative vector following the idea of “Moving Average Reversion” (MAR), which is as shown below:

x̃(t+1, w) = MA_t(w) ⊘ p(t) = (1/w)·(p(t) ⊘ p(t) + p(t−1) ⊘ p(t) + ··· + p(t−w+1) ⊘ p(t)) = (1/w)·(1 + 1 ⊘ x(t) + ··· + 1 ⊘ (⊗_{i=0}^{w−2} x(t−i))), where ⊘ and ⊗ denote element-wise division and multiplication.
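The MAR prediction above reduces to an element-wise division of the window mean by the latest prices; a short NumPy sketch (the helper name is ours, not from the paper):

```python
import numpy as np

# x_hat_{t+1}(w) = MA_t(w) / p_t, element-wise over the m assets.

def mar_price_relative(prices: np.ndarray, w: int) -> np.ndarray:
    """Predicted price relative from the last w rows of close prices.

    prices: (>= w) x m matrix of close prices; the last row is p_t.
    """
    window = prices[-w:]          # p_{t-w+1}, ..., p_t
    ma = window.mean(axis=0)      # MA_t(w)
    return ma / prices[-1]        # element-wise division by p_t

# One asset whose price fell below its moving average: MAR predicts a rise.
prices = np.array([[10.0], [12.0], [8.0]])
x_hat = mar_price_relative(prices, w=3)   # mean(10, 12, 8) / 8 = 1.25
```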


35 of 105

Methodology-OLMAR (4/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • Based on the expected price relative vector, we further adopt the idea of an effective online learning algorithm PA[5].
    • PA stands for Passive Aggressive learning.
    • If the classification is correct, PA passively keeps the previous solution
    • If the classification is incorrect, it aggressively moves to a new solution

[5]: Crammer, K. Online passive-aggressive Algorithms


36 of 105

Methodology-OLMAR (5/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • Part 1.


37 of 105

Methodology-OLMAR (6/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • Part 2.

[6]: J. Duchi. Efficient projections onto the l1-ball for learning in high dimensions. In ICML’2008


38 of 105

Methodology-OLMAR (7/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • The OLMAR solution, ignoring the non-negativity constraint, is b(t+1) = b(t) + λ(t+1)·(x̃(t+1) − x̄(t+1)·1)
    • Here x̄(t+1) = (1/m)·(1·x̃(t+1)) denotes the average predicted price relative, and λ(t+1) is the Lagrange multiplier, calculated as below

λ(t+1) = max{0, (ϵ − b(t)·x̃(t+1)) / ‖x̃(t+1) − x̄(t+1)·1‖²}
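The update and the multiplier can be sketched as follows. This omits the simplex projection OLMAR applies afterwards to enforce non-negativity, and the function name is ours:

```python
import numpy as np

# b_{t+1} = b_t + lambda * (x_hat - x_bar * 1), with
# lambda = max(0, (eps - b_t . x_hat) / ||x_hat - x_bar * 1||^2).

def olmar_update(b: np.ndarray, x_hat: np.ndarray, eps: float = 10.0) -> np.ndarray:
    """One OLMAR step, without the final projection onto the simplex."""
    x_bar = x_hat.mean()                 # average predicted price relative
    diff = x_hat - x_bar                 # x_hat - x_bar * 1
    denom = float(np.dot(diff, diff))    # squared norm
    lam = 0.0 if denom == 0 else max(0.0, (eps - float(np.dot(b, x_hat))) / denom)
    return b + lam * diff

b = np.array([0.5, 0.5])
x_hat = np.array([1.2, 0.9])             # asset 0 is predicted to rise more
b_next = olmar_update(b, x_hat, eps=1.10)  # weight shifts toward asset 0
```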


39 of 105

Methodology-OLMAR (8/8)

  • On-Line Moving Average Reversion (OLMAR) algorithm
    • Empirical observations of parameter sensitivity show that the final performance is sensitive to the window parameter w.
    • For each period, we treat each individual OLMAR with a specific w ≥ 3 as an expert and combine the experts' portfolios, weighted by their historical performance.


40 of 105

Methodology-RNN (1/3)

  • What makes Recurrent Networks so special?
    • From sequence to sequence

from: http://colah.github.io/posts/2015-08-Understanding-LSTMs/


41 of 105

Methodology-RNN (2/3)

  • How to deal with long-term memory
    • LSTM

from: http://colah.github.io/posts/2015-08-Understanding-LSTMs/


42 of 105

Methodology-RNN (3/3)

  • Trial #1:
    • using x(t-n)~x(t) to predict x(t-n+1)~x(t+1)
    • REAL VALUED
  • Trial #2:
    • Classify each x(t) into two categories:
      • if x(t)>1 then x(t) = 1
      • else x(t) = -1
    • using x(t-n)~x(t) to predict x(t-n+1)~x(t+1)
    • CATEGORICAL
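Trial #2's labeling and sequence construction can be sketched as below (the helper names are ours; the actual LSTM training is omitted):

```python
import numpy as np

# Trial #2: each price relative x_t becomes +1 (up) if x_t > 1, else -1 (down);
# the label sequences then feed a sequence-to-sequence LSTM.

def to_labels(x: np.ndarray) -> np.ndarray:
    """Map real-valued price relatives to {+1, -1} categories."""
    return np.where(x > 1.0, 1, -1)

def make_sequences(labels: np.ndarray, n: int):
    """Pairs (labels[t:t+n], labels[t+1:t+n+1]): predict the next-step sequence."""
    xs, ys = [], []
    for t in range(len(labels) - n):
        xs.append(labels[t:t + n])
        ys.append(labels[t + 1:t + n + 1])
    return np.array(xs), np.array(ys)

x = np.array([1.02, 0.97, 1.10, 1.00, 1.05])
labels = to_labels(x)                 # note: exactly 1.0 counts as "down"
xs, ys = make_sequences(labels, n=3)  # training pairs for the LSTM
```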


43 of 105

Methodology-Combined (1/2)

  • Trial #1:
    • First half: OLMAR + LSTM training
    • Second half: LSTM prediction


44 of 105

Methodology-Combined (2/2)

  • Trial #2:
    • First half: OLMAR+training LSTM
    • Second half: use the predicted labels as the signal for whether or not to follow mean reversion.
      • if x(t+1) = -1:
        • if MA(w)/p(t) > 1: x(t+1) = p(t)/MA(w)
        • else: x(t+1) = MA(w)/p(t)
      • if x(t+1) = 1:
        • if MA(w)/p(t) > 1: x(t+1) = MA(w)/p(t)
        • else: x(t+1) = p(t)/MA(w)
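The branching above can be written directly as a small function; `adjusted_price_relative` is a hypothetical name for illustration:

```python
# The LSTM's predicted label (+1 up / -1 down) chooses the direction; the
# magnitude still comes from the moving-average ratio MA(w)/p_t, as in OLMAR.

def adjusted_price_relative(label: int, ma: float, p_t: float) -> float:
    """Expected x_{t+1} given the predicted label, per the rule above."""
    ratio = ma / p_t
    if label == -1:                      # predicted to fall
        return p_t / ma if ratio > 1 else ratio
    else:                                # label == +1, predicted to rise
        return ratio if ratio > 1 else p_t / ma

# MA above the price (ratio 1.25): mean reversion alone would say "rise".
up = adjusted_price_relative(+1, ma=10.0, p_t=8.0)    # keeps 1.25
down = adjusted_price_relative(-1, ma=10.0, p_t=8.0)  # flips to 0.8
```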


45 of 105

Insight

  • If the specific-domain AI says the price will fall, but the General AI (the RNN) predicts a rise, this is a potential moment of intelligence.
    • Follow the General AI's direction: switch to predicting a rise. This is the Aggressive part of Passive Aggressive.
      • The magnitude still comes from OLMAR+; only the direction flips.
  • If the Specific AI and the General AI agree with each other, then simply use the Specific AI.
    • This is the Passive part of Passive Aggressive.
  • In a way, our OLMAR+RNN is similar to that part of the AlphaGo design.


46 of 105

Experiments


47 of 105

Experiment (Data Description)

Dataset  | Region | Time frame           | # days | # assets
NYSE(O)  | US     | 3/7/1962-31/12/1984  | 5651   | 36
NYSE(N)  | US     | 1/1/1985-30/6/2010   | 6431   | 23
DJA      | US     | 1/1/2001-14/1/2003   | 507    | 30
TSE      | CA     | 4/1/1994-31/12/1998  | 1259   | 88

48 of 105

Experiment (Platform)

  • Platform
    • We use Python for the whole project.
    • Libraries we use:
      • TensorFlow (for training the RNN)
      • NumPy, pandas (for the trading system)
    • Machine: GTX 980 with 4 GB RAM


49 of 105

Experiment (Result Trial #1 1/3)

Dataset: NYSE(O)

FAIL! The given features are not strong enough to produce exact price predictions.


50 of 105

Experiment (Result Trial #2 2/3)

Dataset: NYSE(O)


Algo.      | Final wealth | Annualized return | Sharpe ratio | Winning days
BAH        | 14.211138    | 12.56%            | 0.80         | 51.8%
CRP        | 26.677763    | 15.77%            | 1.09         | 53.5%
OLMAR      | 7.656836e+16 | 466.14%           | 3.14         | 58.3%
OLMAR+RNN  | 1.689423e+17 | 486.48%           | 3.20         | 58.5%

51 of 105

Experiment (Result Trial #2 3/3)

Dataset: NYSE(N)


Algo.      | Final wealth | Annualized return | Sharpe ratio | Winning days
BAH        | 18.230572    | 12.05%            | 0.63         | 53.5%
CRP        | 31.823353    | 14.52%            | 0.70         | 54.6%
OLMAR      | 4.209890e+08 | 117.74%           | 1.39         | 55.0%
OLMAR+RNN  | 1.144887e+09 | 126.45%           | 1.45         | 55.2%

52 of 105

Experiment (appendix)

Parameters for the LSTM, trained for 14 epochs

Around 2 hours of training


53 of 105

More Applications

  • For stock-market trading, the combined algorithm is applicable to high-frequency trading, e.g., training at night and predicting during trading hours.


54 of 105

Future Work

  • Incorporate transaction fees
  • More datasets, to avoid survivorship bias
  • Parameter tuning for the LSTM


55 of 105

Summary of Deep Wealth Management

  • Our proposed algorithm pays attention not only to the mean-reversion part (follow the loser) but also to mean advancement (follow the winner).

  • So far, its performance is better than OLMAR's, but the computation time is much longer.


56 of 105

AI-as-a-Service Edge Platform (Cognitive Edge Platform) on Blockchain Platform

57 of 105


Blockchain Platform for Cognitive Edge Computing

Edge Computing for IoT devices

58 of 105

Edge Computing for IoT devices

[Figure: three architectures compared. Cloud Computing: devices (data consumers) talk to a cloud server. Cloud Computing for IoT: IoT devices (data producers and consumers) talk directly to the cloud server. Edge Computing for IoT: groups of IoT devices connect through edge devices, which in turn connect to the cloud server.]

59 of 105

Edge Devices in Edge Computing

[Figure: as in the previous slide, edge devices sit between groups of IoT devices and the cloud server.]

  • Computing offload
  • Data caching/storage
  • Data processing
  • Distribution request
  • Service delivery
  • IoT management
  • Privacy protection

60 of 105

Collaborative Edge

[Figure: in addition to connecting their local IoT devices to the cloud, edge devices from different sites link to each other directly, forming a collaborative edge.]

  • Edge devices can collaborate locally via predefined API protocols between edge devices

  • Information can be fused for synergy and efficiency

  • Example:
    • Government
      • City/country-level issues
      • Disaster management

    • Company (HQ, branch)
      • Management in different sites

Example: hospital, pharmacy, government, financial institutions, fire department, rescue system

W. Shi, J. Cao, Q. Zhang, Y. Li and L. Xu, "Edge Computing: Vision and Challenges," in IEEE Internet of Things Journal, vol. 3, no. 5, pp. 637-646, Oct. 2016.

61 of 105

61

Blockchain Platform for Cognitive Edge Computing

Cognitive Edge Computing

62 of 105

Deep Learning (Example: Google FaceNet)

[Figure: Google FaceNet example. Deep learning over labeled face photos (Alan, Betty, Adam, Barbara) produces models; inference maps each face to a compact face feature vector, registered in a face feature vector list (967A48 = Alan, 01A611 = Adam, 983400 = Betty, 96B259 = Barbara). New faces (Carrie, Carl) yield new vectors (C91369 = Carrie, 97C467 = Carl), which are registered as new records.]
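A toy sketch of the matching step in the figure: a query embedding is compared with the registered face feature vectors by nearest-neighbour distance. The vectors, names, and threshold below are invented for illustration and are not FaceNet's actual data or API:

```python
import numpy as np

# Toy face feature vector list (3-D instead of FaceNet's 128-byte embeddings).
registry = {
    "Alan":  np.array([0.1, 0.9, 0.2]),
    "Adam":  np.array([0.8, 0.1, 0.3]),
    "Betty": np.array([0.4, 0.4, 0.9]),
}

def identify(embedding: np.ndarray, threshold: float = 0.5):
    """Return the registered name with the smallest Euclidean distance,
    or None (a new face to register) if no record is close enough."""
    best_name, best_dist = None, float("inf")
    for name, vec in registry.items():
        dist = float(np.linalg.norm(embedding - vec))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

who = identify(np.array([0.12, 0.88, 0.21]))  # close to Alan's vector
new = identify(np.array([0.9, 0.9, 0.9]))     # no match: register a new record
```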

63 of 105

Deep Learning in Cognitive Edge Computing

[Figure: deep learning in cognitive edge computing. In the cloud-only setting, IoT devices ship raw data to the cloud for deep learning, and the cloud returns inference models. With an edge device in between, the edge performs incremental learning locally, producing incremental models, and sends only de-identified data to the cloud's deep learning.]

  • Privacy Concern
  • Data Transfer Issue
  • Network Issue

Data kept on the edge includes faces (family and friends), personal information (calendar, contacts), medical data, and geographic information (home, office).

64 of 105

Example of Model Size (Google FaceNet)

                      | Deep Learning on the Cloud                     | Incremental Learning on Edge Devices
Resource for learning | Not reduced: 140M parameters, 1.6B FLOPs/image | Reduced in size: 6.6M-7.5M parameters, 500M-1.6B FLOPs/image; incremental learning (personal photo test): 20M FLOPS (NNS2 tiny Inception model)
Learning time         | ~700 hours                                     | 30 ms per image on a smartphone
Model size            | Models: 7.5M parameters, 128 bytes per face    | Incremental models (face feature vectors only): 128 bytes per face

REF: Florian Schroff, Dmitry Kalenichenko, James Philbin, "FaceNet: A Unified Embedding for Face Recognition and Clustering," CVPR 2015, arXiv:1503.03832v3.

65 of 105

Collaborative Edge in Cognitive Edge Computing

Example:

[Figure: a cloud server holds the deep-learning models. Three sites (company HQ, an overseas branch, and a hotel) each run an edge device that performs incremental learning and stores incremental models, managing local IoT devices for human resources, access control, and surveillance (guest access control at the hotel). The edge devices form a collaborative edge.]

66 of 105

Collaborative Edge for a Company

[Figure: the company HQ registers employees (CEO, RD, OP, HR, Manager, MKT). Visitor records at the overseas branch are valid until 2016/10/30. At the hotel, a company guest's record is valid until 2016/10/30, and a regular guest's record is deleted after check-out.]

67 of 105

Requirement for cognitive edge computing


Requires a method to manage "incremental models" among edge devices:

De-centralized, Secure, Traceable, Immutable, Efficient, Flexible for configuration

  • Edge Computing: offload the cloud server in the IoT era
  • Collaborative Edge: edge devices share information peer-to-peer
  • Deep Learning on Cloud: models generated on the cloud
  • Incremental Learning: incremental models generated on the edge

68 of 105


Blockchain Platform for Cognitive Edge Computing

Blockchain as a decentralized database

69 of 105

Brief introduction to Blockchain

  • Blockchain is from Satoshi’s paper in 2008:

“Bitcoin: A Peer-to-Peer Electronic Cash System”

  • Blockchain is a trust machine that changes how the world functions

(Economist 2015-10-31)

  • Blockchain can become the ledger in the internet era

(Wall Street Journal 2015-1-25)

By 2027, 10% of GDP will be on blockchain

(World Economic Forum, Davos 2016)


70 of 105

Blockchain


Key properties: Immutable, Secure, Transaction-is-Settlement, Traceable

71 of 105

How a Blockchain Works


Source: Financial Times

  • A "wallet" is an account that stores records

  • "Coins/tokens" are resources that can be transferred

72 of 105

How blockchain keeps records Immutable


Block header fields:

  • Previous Block Header Hash: a hash, in internal byte order, of the previous block's header. This ensures no previous block can be changed without also changing this block's header.
  • Merkle Root Hash: the merkle root is derived from the hashes of all transactions included in this block, ensuring that none of those transactions can be modified without modifying the header.
  • Target nBits: an encoded version of the target threshold that this block's header hash must be less than or equal to.
  • Nonce: an arbitrary number miners change to modify the header hash in order to produce a hash below the target threshold.
  • txn_count: the total number of transactions in this block.
  • Txns: every transaction in this block, one after another, in raw transaction format.

[Figure: two consecutive blocks. Each block contains a magic number, block size, block header (version, previous block header hash, merkle root hash, time, target nBits, nonce), txn_count, and the raw transactions. The hash of Block n's header becomes the "previous block header hash" field of Block (n+1)'s header, chaining the blocks together.]
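The chaining property can be demonstrated with a toy Python sketch (simplified headers with no Merkle tree or proof-of-work; the double SHA-256 matches how Bitcoin hashes headers):

```python
import hashlib
import json

# Each block's header includes the previous block's header hash, so changing
# any earlier block invalidates every later header hash.

def header_hash(header: dict) -> str:
    """Double SHA-256 of the serialized header."""
    raw = json.dumps(header, sort_keys=True).encode()
    return hashlib.sha256(hashlib.sha256(raw).digest()).hexdigest()

def make_block(prev_hash: str, txns: list) -> dict:
    header = {"prev": prev_hash,
              "txns_hash": hashlib.sha256(json.dumps(txns).encode()).hexdigest()}
    return {"header": header, "txns": txns}

genesis = make_block("0" * 64, ["coinbase"])
block1 = make_block(header_hash(genesis["header"]), ["A pays B 1 coin"])

# Tampering with the genesis transactions breaks the link to block1:
tampered = make_block("0" * 64, ["coinbase (altered)"])
```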

73 of 105

What kind of Trust Machine is it?


Decentralized Trust Machine: secure, immutable, shared, inspectable, anonymous, irreversible.

The Trust Machine is supported by:

  • Privacy, Security: CoinJoin, CoinShuffle, Zero-knowledge proof, Coin Mix, Mix server, 51% attack
  • Math: One way function
  • Crypto: Hashing/Signature/ECC/SHA-256
  • Economics: Monetary policy/Game theory
  • Consensus: Byzantine Generals problem/Hashcash
  • Scalability (Global)
  • Flexibility (Governance structure, smart contract)

74 of 105

State machine-based immutable decentralized Trust Machine


[Figure: the blockchain as a state machine. Each state is a set of UTXOs; transactions consume existing UTXOs and create new ones, moving the system from state to state.]

utxo: unspent transaction output

How the Internet and blockchain both work: the ledger is synchronized by "full" nodes.

75 of 105


Blockchain Platform for Cognitive Edge Computing

Gcoin Blockchain

76 of 105

Consortium/Permissioned Blockchain


Source: Financial Times

77 of 105

Gcoin Blockchain


Multi-tier, multi-centered

• Gcoin adopts a permissioned blockchain and a multi-role structure.

• 'G' represents 'Governance' and 'Global'.

• Multiple roles in the Gcoin network:

◇ The Alliance is responsible for mining and creating blocks. Alliance members have the power to license issuers.

◇ Issuers have the power to issue currencies/coins/tokens.

◇ Users can use one or more currencies/coins/tokens.

Multicurrency

• Multiple currencies/coins/tokens can coexist on the Gcoin network.

• The minting cap for a single currency/coin/token is 10^10.

Smart Contract

Gcoin provides smart contract capability via a simple stack-based script in transactions.

Unlike Bitcoin, Gcoin allows its customers more flexibility.

Improved block creation time

• Gcoin's 15-second block creation time increases tps (transactions per second) substantially.

• A block is created only when there is a transaction; no block is created when there are no transactions.

Avoiding the "51% attack"

• The original Bitcoin blockchain uses Proof-of-Work (PoW). If a single node contributes the majority of mining power (≥ 51%), it can manipulate the blockchain.

• The Gcoin blockchain can increase the mining difficulty of nodes that mine successfully too often, preventing those nodes from dominating the blockchain.

78 of 105

[Figure: the Alliance is a set of full nodes.]

79 of 105

[Figure: issuers connect to the Alliance of full nodes.]

80 of 105

[Figure: three alliance members form the Alliance.]

81 of 105

[Figure: each alliance member licenses issuers.]

82 of 105

[Figure: alliance members, the issuers they license, and the members (users) served by each issuer form a multi-tier, multi-centered network.]

83 of 105


Blockchain Platform for Cognitive Edge Computing

Gcoin Blockchain Platform

for Edge Computing

84 of 105

Gcoin Blockchain Platform for Edge Computing

[Figure: edge-computing management is mapped onto the Gcoin blockchain: tokens keep resource records, and smart contracts handle the resources.]

85 of 105

Gcoin Blockchain Platform for a Government

[Figure: alliance members: Government, Medical, Rescue system. Token issuers: approval process (Ministry of Finance), financial (city finance departments), rescue resources (fire, military, and police systems), and medical resources (Ministry of Health and Welfare, hospitals). The collaborative edge devices span the central government, city governments, and hospitals.]

86 of 105

Gcoin Blockchain Platform for a Company

[Figure: alliance members: Company Management, Human Resource, Supply Chain. Token issuers: approval process, financial, supply-chain management, and employee management. The collaborative edge devices cover financial/accounting, business units (production, MKT/sales, warehousing), human resources and access control, and executive/legal functions, tied together by a "Face Token."]

87 of 105

“Face Token”

[Figure: "Face Tokens" are held in wallets. The company HQ wallet covers employees (CEO, RD, OP, HR, Manager, MKT); the overseas-branch wallet holds visitor tokens valid until 2016/10/30; the hotel wallet holds a company guest token valid until 2016/10/30 and a regular guest token deleted after check-out. Gcoin "smart contracts" handle the management.]

Use token management to manage collaborative edge devices

88 of 105

Gcoin Blockchain as Platform for Cognitive Edge Computing


Gcoin Blockchain Platform

  • Token for resource records
  • Smart contract for resource management / automatic execution

  • Edge Computing: offload the cloud server in the IoT era
  • Collaborative Edge: edge devices share information peer-to-peer
  • Deep Learning on Cloud: models generated on the cloud
  • Incremental Learning: incremental models generated on the edge

89 of 105

Summary of Blockchained AI

  • Cognitive Edge Computing
    • For privacy, incremental models can be learned on edge devices.
    • A solution is needed to manage "incremental models" among a group of edge devices:
      • decentralized, secure, traceable, immutable, efficient, and flexible to configure.

  • Blockchain Platform for Collaborative Edge
    • Blockchain is a candidate solution:
      • it can be a decentralized, secure, traceable, and immutable solution.
    • Gcoin blockchain
      • The Gcoin blockchain adds efficiency and flexibility of configuration and management.

  • Use the Gcoin blockchain for cognitive edge computing on the collaborative edge
    • Transform edge-device management problems into Gcoin "Token" management.
    • For example, a new "Face token" for incremental face-recognition models.
    • Smart contracts can be used for automatic execution.


90 of 105

A showcase of the brave new world of the digital economy

91 of 105

Blockchain points platform

  • Institutions use blockchain to achieve an efficient, fast, and commonly trusted transaction mechanism
  • Third-party payment ledger network (帳聯網)
    • Ordinary users can transfer funds across mobile/third-party payment systems
    • Banks and operators can quickly connect to the ledger network, enjoy the network effects of interoperability, and focus their resources on innovative user experiences and services
  • Supply-chain finance ledger network
    • Real-time monitoring and management of working capital and cash flow
    • Flexible allocation of working capital and cash flow

Ledger network (帳聯網) API

92 of 105

Digital asset trading platform

  • A digital-asset trading platform with strong traceability and extensibility
    • Physical goods: movie tickets, concert tickets, artworks
    • Non-physical goods: loan products, equity products, debt products, intellectual property
  • Interoperability between platform operators and merchants
    • Provides an environment where platform operators can affiliate with one another, e.g., a bill investment-and-financing platform, and where merchants can reach customers across multiple platform operators at once

93 of 105

Blockchain: points management and trading platform

  • The blockchain digital economy also brings big-data benefits
    • Big-data predictive analytics: targeted advertising based on user behavior
    • Business intelligence: insightful metrics and indices for business managers

94 of 105

Gcoin blockchain use cases

Private placement platform

Bill platform

Decentralized exchange

Crowdsourcing platform

Customer loyalty programs

Payment and settlement systems

95 of 105


THANK YOU!

Blockchain Platform for Cognitive Edge Computing

96 of 105

Blockchain: State-Machine-Based Trust Machine

Remember: the blockchain machine is mathematics-based:

  • Math just works;

you don't need to trust it:

Trust Machine vs. Trust Company

97 of 105

What kind of Trust Machine is it?

  • To database researchers:
    • Blockchain is NOT just a brand new data store.
  • To computer network researchers:
    • Blockchain is NOT just a P2P network.
  • To virtual machine and compiler professors:
    • Blockchain is NOT just a smart contract.
  • To programming language professors:
    • Blockchain is NOT just a programming platform.

Blockchain’s key technology: State-Machine based immutable Trust Machine

  • Don’t jump to database, network, … applications.

98 of 105

Data structure of Blockchain

Block header fields:

  • Previous Block Header Hash: a hash, in internal byte order, of the previous block's header. This ensures no previous block can be changed without also changing this block's header.
  • Merkle Root Hash: the merkle root is derived from the hashes of all transactions included in this block, ensuring that none of those transactions can be modified without modifying the header.
  • Target nBits: an encoded version of the target threshold that this block's header hash must be less than or equal to.
  • Nonce: an arbitrary number miners change to modify the header hash in order to produce a hash below the target threshold.
  • txn_count: the total number of transactions in this block.
  • Txns: every transaction in this block, one after another, in raw transaction format.

[Figure: two consecutive blocks. Each block contains a magic number, block size, block header (version, previous block header hash, merkle root hash, time, target nBits, nonce), txn_count, and the raw transactions. The hash of Block n's header becomes the "previous block header hash" field of Block (n+1)'s header.]

99 of 105

“Mining” Blockchain (Proof-of-Work)

[Figure: the current blockchain ends at Block n, each block's header hash feeding the "previous block header hash" of the next. To mine new Block (n+1), the miner hashes its block header and checks whether the header hash is below Target nBits; if not, a new nonce value is set and the header is hashed again, repeating until the check passes (successful mining).]
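The mining loop described above can be sketched as a toy proof-of-work miner; the tiny target and simplified header are for demonstration only:

```python
import hashlib

# Keep changing the nonce until the double-SHA-256 header hash, read as an
# integer, falls below the target. Real difficulty is vastly higher.

def mine(prev_hash, txns_hash, target):
    """Return (nonce, header_hash) with int(header_hash, 16) < target."""
    nonce = 0
    while True:
        header = f"{prev_hash}{txns_hash}{nonce}".encode()
        h = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()
        if int(h, 16) < target:
            return nonce, h        # successful mining
        nonce += 1                 # set a new nonce value and try again

# Require roughly 4 leading zero bits (an easy demo target).
target = 1 << 252
nonce, h = mine("00" * 32, "ab" * 32, target)
```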

100 of 105

Distributing the Mined Block to Peers

[Figure: after successful mining, Block (n+1) is distributed to peers A through H over the peer-to-peer network.]

101 of 105

Arriving at Consensus

  • Although the accepted chain can be considered a list, the blockchain is best represented as a tree.
  • The longest path represents the accepted chain.
  • A participant choosing to extend an existing path in the blockchain casts a vote toward consensus on that path. The longer the path, the more computation was expended building it.
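Longest-path selection over the block tree can be sketched as follows, with a hypothetical child-to-parent map as input; real nodes actually prefer the chain with the most accumulated work, not merely the most blocks:

```python
# Blocks are represented minimally as {block_hash: parent_hash}.

def accepted_chain(blocks: dict, genesis: str) -> list:
    """Return the longest path from genesis (ties broken arbitrarily)."""
    children = {}
    for h, parent in blocks.items():
        children.setdefault(parent, []).append(h)

    def longest_from(h):
        # Recursively pick the longest continuation below block h.
        paths = [longest_from(c) for c in children.get(h, [])]
        best = max(paths, key=len, default=[])
        return [h] + best

    return longest_from(genesis)

# A fork at genesis: the a-branch is longer, so it is the accepted chain.
blocks = {"a1": "gen", "a2": "a1", "b1": "gen", "a3": "a2"}
chain = accepted_chain(blocks, "gen")
```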


102 of 105

Model Distribution (1) for Cloud Server

[Figure: distribution of models from the cloud server to edge devices A through I. With centralized download, every edge device pulls the models directly from the cloud server; with peer-to-peer download, edge devices relay the models to one another.]

103 of 105

Model Distribution (2) for Edge Device

[Figure: distribution of incremental models generated on an edge device. With centralized download, the incremental models pass through the cloud server to the other edge devices A through I; with peer-to-peer download, the edge devices exchange the incremental models directly.]

104 of 105

105 of 105


Backup Slides