Here are my notes on the talks of NodeConf AR day 2.
Node on the Desktop: Building Apps with Electron v1
Felix introduced Electron, a framework for building cross-platform desktop apps that combines Chromium and Node.js (its core is written in C++). He explained the basics of the framework and showed a cool demo of building a simple text editor with monaco-editor (the editor component that powers Visual Studio Code).
Kill all humans — Automate npm releases and dependency updates
From Berlin, Stephan Bönnemann @boennemann got to the stage to talk about automation and how we can reduce human error in trivial tasks. He introduced his tool, Greenkeeper, which monitors your npm dependencies in real time and lets you upgrade them automatically.
He also mentioned the possibility of integrating the semantic-release module with Greenkeeper to automate generating changelog files from commit messages. The library follows the SemVer specification.
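The versioning rule that semantic-release automates can be sketched in a few lines: the commit messages since the last release decide whether the next version is a major, minor, or patch bump. This is an illustrative sketch of that idea, not the library's real API (`nextVersion` is a made-up name):

```javascript
// Decide the next SemVer version from conventional-style commit messages.
// "BREAKING CHANGE" => major, "feat" => minor, anything else => patch.
function nextVersion(current, commitMessages) {
  const [major, minor, patch] = current.split('.').map(Number);
  const breaking = commitMessages.some(m => m.includes('BREAKING CHANGE'));
  const feature = commitMessages.some(m => m.startsWith('feat'));
  if (breaking) return `${major + 1}.0.0`;
  if (feature) return `${major}.${minor + 1}.0`;
  return `${major}.${minor}.${patch + 1}`;
}

console.log(nextVersion('1.2.3', ['fix: handle null input'])); // 1.2.4
console.log(nextVersion('1.2.3', ['feat: add CLI flag']));     // 1.3.0
```

The real tool also generates the changelog and publishes to npm, so a human never types a version number by hand.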
Build cool things, without breaking things.
Internet of Peers
Mathias Buus @mafintosh lives in Copenhagen, Denmark, and is the author of exactly 488 npm modules and counting. He works at Dat Data and is crazy about P2P. He gave a really funny and interesting presentation using a lot of emojis and few words. You can check the slides here.
He invited us to get rid of servers, since they're boring, failure-prone, expensive, and dependent on Ops teams. P2P connections come with some difficulties, which can be addressed with some of his modules/tools:
- Test and report p2p connectivity on the current network: p2p-test
- An Electron demo app for P2P streaming.
This was one of the most interesting parts of Day 2. @mafintosh, @thlorenz, @felixrieseberg and @adam_baldwin got to the stage to discuss topics proposed by the audience and the organizers. Here are some notes from the discussions:
How to start contributing to Open Source communities
- Copy something and put it on GitHub to start gaining experience. Eventually you'll get some contributors.
- Follow GitHub's "Your First PR" project.
- If you want to contribute to a project, open an issue on GitHub first to get feedback and validate that the project is heading in that direction.
- If you have your own module, try to write enough tests to help you review and verify PRs.
JS & NodeJS tooling
- Most of them like to keep tools simple.
The evolution of the language
- If you want to upgrade, analyze trade-offs and think about the cost of making the change.
- Some of them were not aligned with the direction ECMAScript is taking: "There's a race to change the style of the language."
- Felix, in particular, was in favor of keeping the language evolving and adding new features.
- Thorsten recommended always being aware of how ES features are transpiled to plain JS code.
Cool things about NodeJS
- Getting started with JS is really easy.
- Huge ecosystem.
- Possibility to reuse components.
- Node.js can run C code and make native calls.
Adam Baldwin @adam_baldwin is the Chief Security Officer at &yet and the team lead at Lift Security. He got to the stage to talk about how we should treat security in the development of our projects: "Security is not binary, it's always evolving." So we have to follow a Continuous Security approach.
Continuous Security mindset:
- Keep vulnerabilities out of production: Risks increase if vulnerable code gets to production environments. Think ahead about what could go wrong and prevent it.
- Understand flaws: Think beyond tests and tweak them to add scenarios where things can go wrong.
- Challenge assumptions: Dependencies, libraries and frameworks change under the hood. Do not blindly accept new versions without understanding what has been changed.
Adam is the founder of the Node Security Project, which lets you add security checks right into your GitHub pull-request flow. He also mentioned some other useful tools to measure security.
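The core of what tools like nsp check can be sketched as comparing your installed dependency versions against a list of known-vulnerable ranges. The advisory data and function names below are made up for illustration; the real project maintains its own advisory database:

```javascript
// Illustrative advisory list: module names with a "fixed in" version.
const advisories = [
  { name: 'left-pad-ish', vulnerableBelow: '1.1.0' },
];

// Naive check: is this installed version below the fixed version?
// Assumes plain x.y.z versions (no ranges or pre-release tags).
function isVulnerable(name, version, list) {
  const advisory = list.find(a => a.name === name);
  if (!advisory) return false;
  const toNums = v => v.split('.').map(Number);
  const [a, b] = [toNums(version), toNums(advisory.vulnerableBelow)];
  for (let i = 0; i < 3; i++) {
    if (a[i] < b[i]) return true;
    if (a[i] > b[i]) return false;
  }
  return false; // equal to the fixed version, so patched
}

console.log(isVulnerable('left-pad-ish', '1.0.2', advisories)); // true
console.log(isVulnerable('left-pad-ish', '1.1.0', advisories)); // false
```

Running this kind of check in CI is what makes the "keep vulnerabilities out of production" point above continuous rather than a one-off audit.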
Feross Aboukhadijeh @feross is a well-known open source contributor. He's the creator of WebTorrent, a streaming torrent client for Node.js and the browser. Right now he's working on the desktop version using Electron, and he talked about his experience shipping cross-platform desktop apps.
“WebTorrent Desktop is 99% the same code as the web version, but there are some important differences.”
- Use desktop default cursor
- Use the menu bar
- Mac apps quit differently from Windows/Linux apps.
- Code signing: you need to buy a certificate to avoid authorization alerts. He mentioned Symantec as the most reliable vendor selling them.
- BUILDING your app:
- Try to avoid native dependencies, to make building easier. There are a lot of Linux distributions to support.
- Which platforms should you support? He recommended a 32-bit Windows installer, a 64-bit Mac disk image, and Linux .deb files (both 32- and 64-bit).
- Feross created a module called 'arch' which detects the OS architecture.
- UPDATES: use Electron's autoUpdater component to upgrade versions. To avoid prompts, use silent updates.
- SECURITY: Always use HTTPS connections. Keep your server secure (endpoint security).
- DOWNLOAD PERFORMANCE:
- Use CDN
- Keep dependencies light
- Ignore unnecessary files
- Load the dependencies for the critical path first; load the rest later.
- Catch runtime errors. It's better to bubble them up.
- Use telemetry techniques.
Measure — Release — Repeat
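One concrete example of the "Mac apps quit differently" point above: on macOS an app conventionally stays alive after its last window closes, while on Windows and Linux it quits. A minimal sketch of the platform check an Electron app typically makes in its window-close handling (the function name here is illustrative):

```javascript
// On macOS ('darwin') the app keeps running in the dock until the user
// quits explicitly; on other platforms, closing the last window quits.
function shouldQuitOnLastWindowClosed(platform) {
  return platform !== 'darwin';
}

console.log(shouldQuitOnLastWindowClosed('darwin')); // false
console.log(shouldQuitOnLastWindowClosed('win32'));  // true
console.log(shouldQuitOnLastWindowClosed('linux'));  // true
```

In a real Electron app this decision lives in the `window-all-closed` event handler, using `process.platform`.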
Diego started his presentation by talking about local communities and how we can participate to make them bigger. I agree 100% with him that companies should support them, since they make our profession, and the people we work with, better.
Diego described the characteristics and challenges of a microservices architecture, and presented a new perspective for developing P2P-based microservices with three aspects in mind:
- Discovery: A service discovery layer using libraries such as bittorrent-dht and discovery-swarm
- Visibility: Each service should expose data to analyze. Distributed health checks.
- Additive deployments: add new features and limit their traffic until they are mature enough to handle all the scenarios and load correctly.
- Canary release: Push code changes to a small group of end users who are unaware that they are receiving new code.
- Measure deployments: Add metrics (such as code coverage tests and others) to give us useful insights to make decisions.
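The canary-release idea above can be sketched as deterministic traffic splitting: hash each user's id so a fixed percentage of users always lands on the new version, and the same user always sees the same variant. This is an illustrative sketch, not any particular library's API:

```javascript
// Route a user to the canary deployment based on a stable hash of
// their id. `percent` is the share of users (0-100) on the canary.
function routeToCanary(userId, percent) {
  let hash = 0;
  // Simple string hash; illustrative only, not cryptographic.
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) % 1000;
  }
  return (hash % 100) < percent;
}

// With percent = 0 nobody gets the canary; with 100 everybody does,
// and a given user's routing never changes between requests.
console.log(routeToCanary('user-42', 0));   // false
console.log(routeToCanary('user-42', 100)); // true
```

Combining this with the "measure deployments" point, you widen `percent` only as the metrics from the canary group stay healthy.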
Diego ended his talk by explaining why he uses hapi.js as the main framework for developing these services. His main reasons:
- Configuration oriented.
- No middleware.
- Modularity, code is written as plugins.
- It has a great community.
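The "code is written as plugins" point is the interesting one: instead of a middleware chain, the server is assembled from small plugins that each register their own routes. hapi itself isn't shown here; this is a minimal plain-JS sketch of that pattern, with made-up names:

```javascript
// A tiny server whose only extension mechanism is plugin registration,
// mimicking the shape of hapi's plugin model.
function createServer() {
  const routes = new Map();
  return {
    route({ method, path, handler }) {
      routes.set(`${method} ${path}`, handler);
    },
    register(plugin) {
      plugin.register(this);
    },
    inject(method, path) {
      const handler = routes.get(`${method} ${path}`);
      return handler ? handler() : { statusCode: 404 };
    },
  };
}

// Each feature ships as a self-contained plugin.
const healthPlugin = {
  name: 'health',
  register(server) {
    server.route({
      method: 'GET',
      path: '/health',
      handler: () => ({ statusCode: 200, body: 'ok' }),
    });
  },
};

const server = createServer();
server.register(healthPlugin);
console.log(server.inject('GET', '/health').body); // ok
```

In real hapi the equivalents are `server.register()` and `server.route()`; the payoff is that each microservice is just a different composition of plugins over the same core.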
NodeConf AR 2016 ended with two interesting talks.
@mmatuzak talked about his experience hacking his Nintendo. At the beginning he thought it was impossible, but a lot of research, trial and error, frustration, and collaboration with other people made it work. He also created a few modules for working with the console (nesly-sound, nesly-assembler).
@substack closed the event by live-coding some cool examples using WebGL. This is the code he crafted live. He explained line by line what he was doing. He also commented on the main problem with WebGL: it doesn't show error messages. Instead of telling you what's wrong, it simply renders nothing.
NodeConf AR was an incredible event and just the first one! Congratulations to the organizers and I hope they do it again next year keeping all the amazing stuff.
After two long days of presentations, discussions, and a lot of information, we came back to La Plata. First we shared the experience and lessons with the rest of the NaNLABS team members at our offices. And it has already triggered new lightning talks and workshops to discuss, create, and imagine.