2019-01-30 Engineering Meeting (Open to Public)


#1

This is an open-to-the-public meeting. Our engineering team meets, and everyone in the community is welcome — app developers, app users, and the simply curious. You can listen in, ask a question, or participate in our discussions of engineering concerns or questions.

Date/Time: 2019-01-30 @ 15:00 UTC / 10:00 EST / 23:00 HKT
Length: 45 minutes
Meeting link: https://zoom.us/j/9468184266

Agenda

Please reply to this post with items you would like included on the agenda.

Meeting Minutes

Recording of this meeting

Topics from minutes

Discussion of the QA program

  • Jude expressed skepticism about goal 3
  • The team discussed the goal in depth

Release change notification

  • A mailing list might be better
  • Occasional emails rather than relying on users to check for updates
  • Decided to try a forum post that users can subscribe to

On-call discussion

  • Where to find who is currently on call (internally)
  • On-call people will use a phone tree
  • Internal doc for the compassionate exercise of discretion with regard to escalation

Meeting concluded at 7:34


#3

Quality Assurance Program

We want to get feedback about our plan for improving our QA process this quarter.

Importantly, we want to get feedback from the community about how best to integrate their applications with our QA process. Can we integrate your application into our automated test framework? Do you already have tests in place that we can pull from?

The Problem

As we try to deliver on our promises of providing an application platform to developers, encourage developers to become true fans, grow our user base, and avoid scaring away people who would otherwise be interested in our platform, it is existentially important that we have a platform that people feel “works”.

It’s important to deliver on this now, as we’re experiencing the largest growth in developers, apps, and users that we’ve experienced as a platform ever. All of our other priorities, at their basic level, depend on this platform being stable — not just the infrastructure we provide, but importantly the software and tools as well.

Goals

  • Deliver QA processes for four of our projects (blockstack.js, gaia, browser, core)
  • Ensure that no PRs are merged that do not comply with this process
  • Integrate 5 community applications (optimally, with feedback from the developer) into our QA process
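One plausible way to enforce the second goal is a pre-merge gate script run in CI, where a PR can only merge if the script exits successfully. This is only a sketch with placeholder check commands — each project would wire in its own real test and lint tooling:

```shell
#!/usr/bin/env sh
# Hypothetical sketch of a pre-merge CI gate: report success only if
# every check passes. run_unit_tests and run_lint are placeholders
# standing in for each project's real tooling.
set -e  # abort on the first failing check

run_unit_tests() { echo "unit tests: pass"; }  # placeholder
run_lint()       { echo "lint: pass"; }        # placeholder

run_unit_tests
run_lint
echo "all checks passed; PR may be merged"
```

Pairing a script like this with required status checks on the repository would make the process mechanical rather than ad hoc.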

Learnings, data, previous briefs

Roughly speaking, we previously enforced a QA process through the PR mechanism described here: https://github.com/blockstack/blockstack/tree/master/engineering#pull-requests — this system was very ad hoc. Every PR requires reviews, but it’s not clear exactly how a review should be conducted. Is it strictly a code review? Are we enforcing code styles? Do we need unit tests? Integration tests? How should a reviewer conduct end-to-end reviews? Because of the ad hoc process we’ve had in place, we’ve had decidedly mixed results in our releases. Some are smooth, painless, and go completely unnoticed by our developers. Others result in multiple stop-everything usability or security bugs.

From these releases, we’ve learned a few things:

  1. We need much more detailed processes for the review and testing requirements prior to releases. These processes will necessarily be project dependent.
  2. Blockstack team members need to be “on call” to determine whether a given bug report needs to be elevated to the emergency level. Even with an “on call” engineer, we cannot ship a release whose consequences we will not have the engineering capacity to deal with (i.e., no releases before a long break).
  3. The decentralized nature of data on our platform makes fixing bugs hard! Security issues require coordinated efforts. Bugs in blockstack.js could require applications to update across a variety of platforms.
  4. On some level, lacking this process has slowed down progress on platform features — we hold off on releases because gaining the confidence that a release won’t break something requires a lot of effort and focused reviewing. With a better process, we should be able to have both more confidence in our releases and more frequent releases.