NodeConf AR 2016 – Day 2
Node on the Desktop, Internet of peers, Continuous Security and more.
On Saturday, November 19th, we were at the entrance of Ciudad Cultural Konex at 9 am, excited about the second day of the conference. Day 1 had been intense and motivating, and we wanted more!
Here are my notes on the talks from NodeConf AR Day 2.
Node on the desktop: building apps with Electron v1
Felix Rieseberg @felixrieseberg is one of the four engineers who crafted the Slack desktop app. With an auditorium packed with Slack users, Felix was impressed by how widely used it is.
He introduced Electron, a framework for building cross-platform desktop apps out of NodeJS, Chromium and C++. Felix explained the basics of the framework and showed a cool demo of building a simple text editor with monaco-editor (the editor component that powers Visual Studio Code).
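To give an idea of how small the surface is, here's a minimal main-process sketch (Electron v1-era API; the index.html file name is just an assumption for the example):

// main.js: open a single window that loads a local page.
const { app, BrowserWindow } = require('electron')

let win = null

app.on('ready', function () {
  win = new BrowserWindow({ width: 800, height: 600 })
  win.loadURL('file://' + __dirname + '/index.html')
  win.on('closed', function () { win = null })
})

app.on('window-all-closed', function () {
  // On macOS, apps usually stay alive until the user quits explicitly.
  if (process.platform !== 'darwin') app.quit()
})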
Kill all humans — automate npm releases and dependency updates
From Berlin, Stephan Bönnemann @boennemann got on stage to talk about automation and how we can reduce human error in trivial tasks. He introduced his tool Greenkeeper, which monitors npm dependencies in real time and lets you upgrade them automatically.
He also mentioned the possibility of integrating the semantic-release module with Greenkeeper to automate the generation of changelog files from commit messages. The library follows the SemVer specification.
Build cool things, without breaking things.
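As a toy illustration of the convention Stephan described: the commit type decides the SemVer bump. The simplified mapper below is my own sketch, not semantic-release's actual implementation.

// Map an Angular-style commit message to a release type (simplified sketch).
function releaseType (commitMessage) {
  if (/BREAKING CHANGE/.test(commitMessage)) return 'major'
  if (/^feat(\(.+\))?:/.test(commitMessage)) return 'minor'
  if (/^fix(\(.+\))?:/.test(commitMessage)) return 'patch'
  return null // e.g. docs/chore commits trigger no release at all
}

console.log(releaseType('fix: handle empty config file')) // 'patch'
console.log(releaseType('feat: add offline mode'))        // 'minor'
console.log(releaseType('feat: new storage format\n\nBREAKING CHANGE: old data must be migrated')) // 'major'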
Internet of peers
Mathias Buus @mafintosh lives in Copenhagen, Denmark, and is the author of exactly 488 npm modules and counting. He works at Dat Data and is crazy about P2P. He gave a really funny and interesting presentation using a lot of emojis and very few words.
He invited us to get rid of servers, since they're boring, failure-prone, expensive and dependent on Ops teams. P2P connections have some difficulties of their own, which can be dealt with using some of his modules and tools:
p2p-test: tests and reports P2P connectivity on the current network.
An Electron demo app for streaming over P2P.
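To give an idea of what these building blocks look like, here's a minimal discovery-swarm sketch (based on the module's documented usage; the topic name and port are arbitrary):

var swarm = require('discovery-swarm')()

swarm.listen(8000)
swarm.join('my-p2p-topic') // any id/name/hash both peers agree on

swarm.on('connection', function (connection) {
  // `connection` is a duplex stream to the peer: pipe your data over it.
  console.log('found and connected to a peer')
})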
Debate panel
This was one of the most interesting parts of Day 2. @mafintosh, @thlorenz, @felixrieseberg and @adam_baldwin got on stage to discuss topics proposed by the audience and the organizers. Here are some notes from the discussions:
How to start contributing to open source communities
Copy something and put it on GitHub to start gaining experience. Eventually you'll get some contributors.
Follow GitHub.
If you want to contribute to a project, open an issue on GitHub first to get feedback and validate whether the project is heading in that direction.
If you have your own module, try to write enough tests to help you review and verify PRs.
JS & NodeJS tooling
Most of them like to keep tools simple.
The evolution of the language
If you want to upgrade, analyze trade-offs and think about the cost of making the change.
Some of them were not aligned with the direction ECMAScript is taking: "There's a race toward the style of the language."
Felix, in particular, was in favor of keeping the language evolving and adding new features.
Thorsten recommended always being aware of how new ES features get translated into plain JS code.
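For instance, here's roughly what a transpiler like Babel (es2015 preset) does with a simple arrow function:

// ES2015 source:
//   const double = (n) => n * 2;
//
// Roughly what it becomes as plain ES5 after transpilation:
var double = function double(n) {
  return n * 2;
};

console.log(double(21)) // 42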
Cool things about NodeJS
Getting started with JS is really easy.
V8 JavaScript Engine is extremely fast.
Huge ecosystem.
Hardware, embedded systems, IoT and other devices are running JavaScript code.
Portability
Isomorphic JavaScript apps are JavaScript applications that can run both client-side and server-side.
Possibility to reuse components.
NodeJS can run C code and make native calls.
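A tiny sketch of that portability idea: a module with no DOM and no Node-only APIs can be required in Node and also bundled (or dropped in a script tag) for the browser. The function and file names here are made up for the example.

// format-price.js: no DOM, no Node-only APIs, so it runs anywhere.
function formatPrice (cents) {
  return '$' + (cents / 100).toFixed(2)
}

if (typeof module !== 'undefined' && module.exports) {
  // NodeJS / browserify / webpack (CommonJS)
  module.exports = formatPrice
} else {
  // Plain <script> tag in the browser
  window.formatPrice = formatPrice
}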
Continuous security
Adam Baldwin @adam_baldwin is the Chief Security Officer at &yet and the team lead at Lift Security. He got on stage to talk about how we should treat security during the development of our projects. "Security is not binary, it's always evolving", so we have to follow a Continuous Security approach.
Continuous security mindset:
Keep vulnerabilities out of production: Risk increases if vulnerable code reaches production environments. Think ahead about what could go wrong and prevent it.
Understand flaws: Think beyond the happy-path tests and tweak them to add scenarios where things can go wrong (see the small sketch after this list).
Challenge assumptions: Dependencies, libraries and frameworks change under the hood. Don't blindly accept new versions without understanding what has changed.
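As a small illustration of the "understand flaws" point, alongside the happy-path test you can assert how the code should fail. The parseAmount function below is hypothetical, using plain Node assert.

const assert = require('assert')

// Hypothetical input parser, only for this example.
function parseAmount (input) {
  const value = Number(input)
  if (!Number.isFinite(value) || value < 0) {
    throw new Error('invalid amount: ' + input)
  }
  return value
}

// Happy path
assert.strictEqual(parseAmount('42'), 42)

// Scenarios where things can go wrong
assert.throws(function () { parseAmount('-1') })
assert.throws(function () { parseAmount('42; DROP TABLE users') })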
Adam is the founder of the Node Security Project, which lets you add security checks right into your GitHub pull request flow. He also mentioned some other useful tools for measuring security.
Real world Electron: building cross-platform desktop apps with JavaScript
Feross Aboukhadijeh @feross is a well-known open source contributor. He's the creator of WebTorrent, a streaming torrent client for NodeJS and the browser. Right now he's working on the desktop version using Electron, and he talked about his experience shipping cross-platform desktop apps.
“WebTorrent Desktop is 99% the same code as the web version, but there are some important differences.”
UX:
Use the default desktop cursor.
Use the menu bar
Mac apps quit differently from Windows/Linux apps.
Code Signing: You need to buy a code-signing certificate to avoid scary authorization alerts. Symantec is the most reliable vendor selling them.
BUILDING your app:
Try to avoid native dependencies so builds stay simple; there are a lot of Linux distributions out there.
Which platforms should you support? He recommended a 32-bit Windows installer, a 64-bit Mac disk image and Linux .deb packages (both 32 and 64 bits).
Feross created a module called arch which detects the real architecture of the operating system.
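Its usage is tiny; something along these lines (the installer file names are hypothetical):

var arch = require('arch')

// Returns the real OS architecture ('x64' or 'x86'), even when a 32-bit
// Node/Electron build is running on 64-bit Windows.
console.log(arch())

// e.g. decide which installer to offer:
var installer = arch() === 'x64' ? 'MyApp-Setup-x64.exe' : 'MyApp-Setup-ia32.exe'
console.log('Would download ' + installer)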
UPDATES: Use Electron's autoUpdater module to ship new versions. To avoid prompts, use silent updates.
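A rough sketch of that setup, assuming a Squirrel-style update server (the feed URL is hypothetical):

const { app, autoUpdater } = require('electron')

app.on('ready', function () {
  // Hypothetical endpoint serving platform-specific update feeds.
  autoUpdater.setFeedURL('https://updates.example.com/' + process.platform + '/' + app.getVersion())
  autoUpdater.checkForUpdates()

  autoUpdater.on('update-downloaded', function () {
    // Silent update: either apply it on the next launch, or call
    // autoUpdater.quitAndInstall() at a moment that won't interrupt the user.
  })
})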
SECURITY: Always use HTTPS connections. Keep your server secure (endpoint security).
DOWNLOAD PERFORMANCE:
Use CDN
Keep dependencies light
Ignore unnecessary files
Load dependencies for the critical path. Load the rest later
MEASURE:
Catch runtime errors and bubble them up, e.g. with process.on('uncaughtException'), instead of failing silently (see the sketch after this list).
Use telemetry techniques.
Measure — Release — Repeat
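Here's a minimal sketch of the error-bubbling idea mentioned above; the telemetry endpoint and payload shape are hypothetical:

const https = require('https')

// Bubble runtime errors up instead of letting the process die silently.
process.on('uncaughtException', function (err) {
  report('uncaught-exception', { message: err.message, stack: err.stack })
})

// Hypothetical helper that posts an event to your own telemetry endpoint.
function report (event, data) {
  const body = JSON.stringify({ event: event, data: data, platform: process.platform })
  const req = https.request({
    hostname: 'telemetry.example.com', // hypothetical
    path: '/events',
    method: 'POST',
    headers: { 'Content-Type': 'application/json' }
  })
  req.on('error', function () {}) // telemetry itself must never crash the app
  req.end(body)
}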
Micro(hapi)ness
Originally from Tierra del Fuego, Diego Paez @carax lives in La Plata and is a co-founder of LaPlataJS. He was representing our city, and we were excited about his talk.
He started his presentation talking about local communities and how we can participate to make them bigger. I agree 100% with him that companies should support them, since they make our profession, and the people we work with, better.
Diego described the characteristics and challenges of a micro-services architecture, and then presented a new perspective for developing P2P-based micro-services, keeping three aspects in mind:
Discovery: A service-discovery layer using libraries such as bittorrent-dht and discovery-swarm.
Visibility: Each service should expose data for analysis; distributed health checks.
Resiliency:
Additive deployments: Add new features and limit their traffic until they are mature enough to handle all scenarios and load correctly.
Canary releases: Push code changes to a small group of end users who are unaware they are receiving new code.
Measure deployments: Add metrics (such as code coverage and others) that give useful insights for making decisions.
Diego ended his talk explaining why he uses hapi.js as the main framework for developing these services (a minimal example follows the list). His main reasons:
Configuration oriented.
No middleware.
Modularity: code is written as plugins.
It has a great community.
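Here's a minimal hapi sketch in that configuration-oriented style (hapi v16-era API; the health-check route is my own example):

const Hapi = require('hapi')

const server = new Hapi.Server()
server.connection({ port: 3000 })

// Routes are declared as configuration objects, not middleware chains.
server.route({
  method: 'GET',
  path: '/health',
  handler: function (request, reply) {
    reply({ status: 'ok', uptime: process.uptime() })
  }
})

server.start(function (err) {
  if (err) throw err
  console.log('Service running at', server.info.uri)
})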
The end
NodeConf AR 2016 ended with two interesting talks.
@mmatuzak talked about his experience hacking his Nintendo. At the beginning he thought it was impossible, but a lot of research, trial and error, frustration, and collaboration with other people made it work. He also created a few modules for working with the console (nesly-sound, nesly-assembler).
@substack closed the event by live-coding some cool examples using WebGL. He also commented on WebGL's main problem: it doesn't show error messages, so instead of being told what's wrong, you just don't see anything.
NodeConf AR was an incredible event, and this was just the first edition! Congratulations to the organizers; I hope they do it again next year and keep all the amazing stuff.
After two long days of presentations, discussions and a lot of information, we came back to La Plata. First we shared the experience and the lessons with the rest of the NaNLABS team at our offices, and it has already triggered new light talks and workshops to discuss, create and imagine.