Luciano Mammino (@loige)
IT’S ABOUT TIME TO
EMBRACE STREAMS
  
loige.link/streams-cityjs
London - May 3, 2019
1
// buffer-copy.js
const {
  readFileSync,
  writeFileSync
} = require('fs')

const [,, src, dest] = process.argv

// read entire file content
const content = readFileSync(src)

// write that content somewhere else
writeFileSync(dest, content)
@loige2
WE DO THIS ALL THE TIME
AND IT'S OK
BUT SOMETIMES ...
@loige3
 ERR_FS_FILE_TOO_LARGE! 
File size is greater than possible Buffer
@loige4
BUT WHY?
@loige5
IF BYTES WERE BLOCKS...
@loige6
MARIO CAN LIFT A FEW BLOCKS
@loige7
BUT NOT TOO MANY...
?!
@loige8
WHAT CAN WE DO IF WE HAVE TO MOVE MANY BLOCKS?
@loige9
WE CAN MOVE THEM ONE BY ONE!
we stream them...
@loige10
 HELLO, I AM LUCIANO! 
Cloud Architect
Blog: loige.co
Twitter: @loige
GitHub: @lmammino
11
code: loige.link/streams-examples
loige.link/streams-cityjs
12
01. BUFFERS VS STREAMS
@loige13
BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA
@loige
* Note that this is loading all the content of the file in memory
14
STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA
@loige
* It does not load all the data straight away
15
FILE COPY: THE BUFFER WAY
@loige
// buffer-copy.js
const {
  readFileSync,
  writeFileSync
} = require('fs')

const [,, src, dest] = process.argv
const content = readFileSync(src)
writeFileSync(dest, content)
16
FILE COPY: THE STREAM WAY
// stream-copy.js
const {
  createReadStream,
  createWriteStream
} = require('fs')

const [,, src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)
srcStream.on('data', (data) => destStream.write(data))
@loige
* Careful: this implementation is not optimal
17
MEMORY COMPARISON (~600MB FILE)
node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd
@loige18
MEMORY COMPARISON (~600MB FILE)
node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd
@loige19
LET'S TRY WITH A BIG FILE (~10GB)
node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv
@loige20
 STREAMS VS BUFFERS 
Streams keep a low memory footprint even with large amounts of data
Streams allow you to process data as soon as it arrives
@loige21
02. STREAM TYPES & APIS
@loige22
ALL STREAMS ARE EVENT EMITTERS
A stream instance is an object that emits events when its internal
state changes, for instance:

s.on('readable', () => {}) // ready to be consumed
s.on('data', (chunk) => {}) // new data is available
s.on('error', (err) => {}) // some error happened
s.on('end', () => {}) // no more data available

The events available depend on the type of stream
@loige23
READABLE STREAMS
A readable stream represents a source from which data is consumed.
Examples:
fs readStream
process.stdin
HTTP response (client-side)
HTTP request (server-side)
AWS S3 GetObject (data field)
It supports two modes for data consumption: flowing and paused (or
non-flowing) mode.
@loige24
READABLE STREAMS
In flowing mode, data is read from the source automatically and chunks are
emitted as soon as they are available.
[Diagram: chunks of source data are read one by one by a readable stream in
flowing mode and emitted to a data listener; when no more data is available,
end is emitted.]
@loige25
// count-emojis-flowing.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)
const file = createReadStream(process.argv[2])
let counter = 0

file.on('data', chunk => {
  for (let char of chunk.toString('utf8')) {
    if (emojis.includes(char)) {
      counter++
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))
@loige35
loige.link/st_patrick
@loige36
WRITABLE STREAMS
A writable stream is an abstraction that allows you to write data to a destination.

Examples:
fs writeStream
process.stdout, process.stderr
HTTP request (client-side)
HTTP response (server-side)
AWS S3 PutObject (body parameter)
@loige37
// writable-http-request.js
const http = require('http')

const req = http.request(
  {
    hostname: 'enx6b07hdu6cs.x.pipedream.net',
    method: 'POST'
  },
  resp => {
    console.log(`Server responded with "${resp.statusCode}"`)
  }
)
req.on('finish', () => console.log('request sent'))
req.on('close', () => console.log('Connection closed'))
req.on('error', err => console.error(`Request failed: ${err}`))
req.write('writing some content...\n')
req.end('last write & close the stream')
@loige38
@loige39
loige.link/writable-http-req
@loige40
BACKPRESSURE
When writing large amounts of data you should make sure you handle the
"stop writing" signal (write() returning false) and the drain event

loige.link/backpressure
@loige41
// stream-copy-safe.js
const { createReadStream, createWriteStream } = require('fs')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    // we are overflowing the destination, we should pause
    srcStream.pause()
    // we will resume when the destination stream is drained
    destStream.once('drain', () => srcStream.resume())
  }
})
@loige42
OTHER TYPES OF STREAM
Duplex Stream
streams that are both Readable and Writable
(net.Socket)

Transform Stream
Duplex streams that can modify or transform the data as it is written
and read
(zlib.createGzip(), crypto.createCipheriv())
@loige43
ANATOMY OF A TRANSFORM STREAM
1. write data (writable stream side)
2. transform the data (transform stream)
3. read transformed data (readable stream side)
@loige44
GZIP EXAMPLE
1. write uncompressed data (writable stream side)
2. compress the data (zlib.createGzip())
3. read compressed data (readable stream side)
@loige45
// stream-copy-gzip.js
const {
  createReadStream,
  createWriteStream
} = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = gzipStream.write(data)
  if (!canContinue) {
    srcStream.pause()
    gzipStream.once('drain', () => {
      srcStream.resume()
    })
  }
})
srcStream.on('end', () => {
  // check if there's buffered data left
  const remainingData = gzipStream.read()
  if (remainingData !== null) {
    destStream.write(remainingData)
  }
  gzipStream.end()
})

gzipStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    gzipStream.pause()
    destStream.once('drain', () => {
      gzipStream.resume()
    })
  }
})
gzipStream.on('end', () => {
  destStream.end()
})
// ⚠ TODO: handle errors!
@loige46
03. PIPE()
@loige47
readable.pipe(writableDest)

Connects a readable stream to a writable stream
A transform stream can be used as a destination as well
It returns the destination stream, allowing for a chain of pipes:

readable
  .pipe(transform1)
  .pipe(transform2)
  .pipe(transform3)
  .pipe(writable)
@loige
48
// stream-copy-gzip-pipe.js
const {
  createReadStream,
  createWriteStream
} = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)
@loige49
Set up complex pipelines with pipe
readable
  .pipe(decompress)
  .pipe(decrypt)
  .pipe(convert)
  .pipe(encrypt)
  .pipe(compress)
  .pipe(writeToDisk)
This is the most common way to use streams
@loige
50
Handling errors (correctly)
readable
  .on('error', handleErr)
  .pipe(decompress)
  .on('error', handleErr)
  .pipe(decrypt)
  .on('error', handleErr)
  .pipe(convert)
  .on('error', handleErr)
  .pipe(encrypt)
  .on('error', handleErr)
  .pipe(compress)
  .on('error', handleErr)
  .pipe(writeToDisk)
  .on('error', handleErr)

handleErr should end and destroy the streams
(it doesn't happen automatically)
@loige
51
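One possible shape for handleErr (a sketch; the factory name is ours, not part of the Node API) is a closure over every stream in the pipeline that destroys them all when any one of them fails:

```javascript
// a sketch of an error handler that cleans up the whole pipeline
function makeHandleErr (...streams) {
  return function handleErr (err) {
    console.error(`Pipeline failed: ${err}`)
    // pipe() does not destroy the other streams on error,
    // so we have to do it explicitly
    streams.forEach(stream => stream.destroy())
  }
}
```

Every .on('error', handleErr) in the chain would then share the same handler, created once with all the streams of the pipeline.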
04. STREAM UTILITIES
@loige52
stream.pipeline(...streams, callback)
// stream-copy-gzip-pipeline.js
const { pipeline } = require('stream')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pipeline(
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
    console.log('Done!')
  }
)
@loige
Can pass multiple streams (they will be piped)
The last argument is a callback. If invoked with an
error, it means the pipeline failed at some point.
All the streams are ended and destroyed correctly.
53
readable-stream - npm.im/readable-stream
npm package that contains the latest version of the Node.js stream library.
It also makes Node.js streams compatible with the browser (can be used with
Webpack and Browserify)
@loige
* yeah, the name is misleading. The package offers all the functionalities of the official 'stream'
module, not just readable streams.
54
05. WRITING CUSTOM STREAMS
@loige55
@loige
EmojiStream → Uppercasify → DOMAppend
 LEMON
 BANANA

class EmojiStream extends Readable {
  _read () {
    // ...
  }
}

class Uppercasify extends Transform {
  _transform (chunk, enc, done) {
    // ...
  }
}

class DOMAppend extends Writable {
  _write (chunk, enc, done) {
    // ...
  }
}

this.push(data) passes data to the next step
56
// emoji-stream.js (custom readable stream)
const { EMOJI_MAP } = require('emoji') // from npm
const { Readable } = require('readable-stream') // from npm

const emojis = Object.keys(EMOJI_MAP)

function getEmojiDescription (index) {
  return EMOJI_MAP[emojis[index]][1]
}

function getMessage (index) {
  return emojis[index] + ' ' + getEmojiDescription(index)
}

class EmojiStream extends Readable {
  constructor (options) {
    super(options)
    this._index = 0
  }

  _read () {
    if (this._index >= emojis.length) {
      return this.push(null)
    }
    return this.push(getMessage(this._index++))
  }
}

module.exports = EmojiStream
@loige57
// uppercasify.js (custom transform stream)
const { Transform } = require('readable-stream')

class Uppercasify extends Transform {
  _transform (chunk, encoding, done) {
    this.push(chunk.toString().toUpperCase())
    done()
  }
}

module.exports = Uppercasify
@loige58
// dom-append.js (custom writable stream)
const { Writable } = require('readable-stream')

class DOMAppend extends Writable {
  _write (chunk, encoding, done) {
    const elem = document.createElement('li')
    const content = document.createTextNode(chunk.toString())
    elem.appendChild(content)
    document.getElementById('list').appendChild(elem)
    done()
  }
}

module.exports = DOMAppend
@loige59
06. STREAMS IN THE BROWSER
@loige60
// browser/app.js
const EmojiStream = require('../emoji-stream')
const Uppercasify = require('../uppercasify')
const DOMAppend = require('../dom-append')

const emoji = new EmojiStream()
const uppercasify = new Uppercasify()
const append = new DOMAppend()

emoji
  .pipe(uppercasify)
  .pipe(append)
@loige61
Let's use webpack to build this app for the browser

npm i --save-dev webpack webpack-cli
node_modules/.bin/webpack src/browser/app.js
# creates dist/main.js
mv dist/main.js src/browser/app.bundle.js
@loige
62
Finally let's create an index.html

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <meta
      name="viewport"
      content="width=device-width, initial-scale=1, shrink-to-fit=no"
    />
    <title>Streams in the browser!</title>
  </head>
  <body>
    <ul id="list"></ul>
    <script src="app.bundle.js"></script>
  </body>
</html>
@loige
63
@loige64
06. CLOSING
@loige65
Streams have low memory footprint
Process data as soon as it's available
Composition through pipelines
Streams are abstractions:
Readable = Input
Transform = Business Logic
Writable = Output
@loige
TLDR;
66
IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS...
nodejs.org/api/stream.html
github.com/substack/stream-handbook
@loige67
IF YOU ARE NOT CONVINCED YET...
@loige
curl parrot.live
68
@loige
github.com/hugomd/parrot.live
Check out the codebase
69
@loige70
@loige
THANKS!
loige.link/streams-cityjs
We are hiring, talk to me! :)
71
CREDITS
Dan Roizer on Unsplash for the cover picture
emojiart.org for the amazing St. Patrick emoji art
The internet for the memes! :D
SPECIAL THANKS
@StefanoAbalsamo, @mariocasciaro, @machine_person, @Podgeypoos79, @katavic_d, @UrsoLuca
@loige72
