1 of 27

JSONStream

ENHANCE USER EXPERIENCE

Vihang Patel

Senior Developer - Frontend @BookMyShow

2 of 27

3 of 27

What is JSONStream?

4 of 27

JSONStream

A JSON stream is basically a continuous stream of

  • Individual
  • Meaningful
  • Parsable

JSON object chunks sent over the wire (illustrated below).
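For illustration only (these venue objects are made up), such a stream could look like newline-delimited JSON, where every line is parsable on its own:

{"venue": "Venue A", "shows": 12}
{"venue": "Venue B", "shows": 8}
{"venue": "Venue C", "shows": 15}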

5 of 27

What problem does it solve?

6 of 27

  • Slow and coupled server responses

  • Big AJAX requests are slow and brittle

  • Multiple AJAX requests increase complexity

  • Parsing one big JSON object in memory is inefficient

7 of 27

8 of 27

How it works?

  • The response should be chunked (see the sketch below)

  • By default, the chunk size is 64K

  • The browser must support chunked responses
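A minimal sketch (not from the deck) of a chunked response in plain Node: when res.write() is called repeatedly without a Content-Length header, Node automatically sends the response with "Transfer-Encoding: chunked".

const http = require("http")

http.createServer((req, res) => {
  // No Content-Length is set, so Node uses Transfer-Encoding: chunked
  res.writeHead(200, { "Content-Type": "application/json" })
  res.write('{"venue": "Venue A"}\n') // first chunk
  res.write('{"venue": "Venue B"}\n') // second chunk
  res.end()
}).listen(3000)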

9 of 27

Demo built with

  • Create React App

  • Express server

10 of 27

Comparison

With streaming: the first render was quick.

Without streaming: it took ~1.2s for the first render on high-end devices.

11 of 27

BACKEND CHANGES

12 of 27

Response

  • 108 venues

  • Raw response size: 839KB

  • Each venue is ~7KB on average

13 of 27

[Diagram: the JSON response divided into a sequence of chunks]

Step 1

The JSON response should be divided into meaningful chunks.

14 of 27

// SIZE is the maximum chunk length; the default chunk size is 64K
const SIZE = 64 * 1024

let stringData = []

for (let c = 0; c < data.length; c++) {
  const temp = JSON.stringify(data[c])
  const arr = chunkify(temp)
  stringData = [...stringData, ...arr]
}

// Recursively split a string into pieces of at most SIZE characters
function chunkify(data) {
  if (data.length > SIZE) {
    const head = data.substr(0, SIZE)
    return [head, ...chunkify(data.substr(SIZE))]
  }
  return [data]
}
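Hypothetical usage (hugeVenue is a made-up stand-in for one large venue object):

// JSON.stringify(hugeVenue) is split into pieces of at most SIZE characters
const hugeVenue = { name: "Venue A", shows: new Array(20000).fill("show") }
const chunks = chunkify(JSON.stringify(hugeVenue))
chunks.forEach(chunk => console.log(chunk.length))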

15 of 27

[Diagram: a stream of chunks of varying sizes (3K–64K): 50K, 10K, 25K, 30K, 64K, 3K, 20K, 25K, 43K, 12K, 55K, 23K]

16 of 27

app.get("/stream-api", function(req, res) {

for (let c = 0; c < chunkedData.length; c++) {

setTimeout(() => {

res.write(chunkedData[c])

res.flush()

if (c == chunkedData.length - 1) {

res.end()

}

}, c * 50)

}

})

Step 2

Each venue becomes a chunk and is streamed over the wire.
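Note that res.flush() is not part of core Express; it is provided by the compression middleware. A minimal setup sketch (this wiring is assumed, not shown in the deck):

const express = require("express")
const compression = require("compression")

const app = express()
// compression() gzips responses and adds res.flush(), which forces
// the currently buffered output to the client right away
app.use(compression())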

17 of 27

CLIENT CHANGES

18 of 27

Client Changes

  • The browser should be able to interpret chunks

  • Almost all browsers support the "Transfer-Encoding: chunked" response header

  • Interpret each chunk, decode it, and parse it into a JSON object (see the sketch after this list)
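A minimal sketch of that decode-and-parse step, assuming each chunk arrives as one complete, individually parsable JSON object (handleChunk and renderVenue are illustrative names):

// "value" from reader.read() is a Uint8Array of raw bytes
const decoder = new TextDecoder("utf-8")

function handleChunk(value) {
  const text = decoder.decode(value) // bytes -> string
  const venue = JSON.parse(text)     // string -> JSON object
  renderVenue(venue)                 // hypothetical render step
}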

19 of 27

fetch(urlToLoad).then(response => {
  const reader = response.body.getReader()
})

20 of 27

fetch(urlToLoad).then(response => {
  const reader = response.body.getReader()
  const stream = new ReadableStream({
    // ...
  })
})

21 of 27

fetch(urlToLoad).then(response => {
  const reader = response.body.getReader()
  // read until the last chunk is encountered
  const stream = new ReadableStream({
    start(controller) {
      // ...
    }
  })
})

22 of 27

fetch(urlToLoad).then(response => {
  const reader = response.body.getReader()
  // read until the last chunk is encountered
  const stream = new ReadableStream({
    start(controller) {
      function push() {
        // "done" is a Boolean and "value" a Uint8Array
      }
    }
  })
})

23 of 27

fetch(urlToLoad).then(response => {
  const reader = response.body.getReader()
  // read until the last chunk is encountered
  const stream = new ReadableStream({
    start(controller) {
      function push() {
        return reader.read().then(({ done, value }) => {
          // "done" is a Boolean and "value" a Uint8Array
          if (done) {
            controller.close() // last chunk seen, close the stream
            return
          }
          controller.enqueue(value)
          push()
        })
      }
      push()
    }
  })
})
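One way to consume the stream built above (a sketch, not the deck's code): wrap it in a Response and read it like an ordinary body; for progressive rendering, decode and parse each "value" with the earlier handleChunk sketch before enqueueing it.

// Reads the reconstructed stream as text once it completes
new Response(stream).text().then(text => console.log(text.length))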

24 of 27

CHALLENGES

25 of 27

Challenges

  • Extra one-time processing is incurred to create meaningful, parsable chunks

  • Support must be verified across browsers

  • If the API endpoint is used by multiple platforms, every platform must handle the streamed response

26 of 27

Libraries

  • Oboe.js (sketch below)

  • Highland.js
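For example, Oboe.js can fire a callback for each node of a streamed JSON document as it is parsed ("/stream-api", the "venues.*" pattern, and renderVenue are assumptions for illustration):

// Each matching element triggers the callback as soon as it has been
// parsed, without waiting for the rest of the response
oboe("/stream-api")
  .node("venues.*", venue => {
    renderVenue(venue) // hypothetical render step
  })
  .done(() => console.log("stream complete"))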

27 of 27

Thank you