Safe Browsing in Epiphany

I am pleased to announce that Epiphany users will now benefit from safe browsing support, which can detect and alert users whenever they visit a potentially malicious website. This feature will ship in GNOME 3.28, but those who don’t wish to wait that long can go ahead and build Epiphany from master to get it sooner.

The safe browsing support is enabled by default in Epiphany, but you can always disable it from the preferences dialog by toggling the checkbox under General -> Web Content -> Try to block dangerous websites.

Safe browsing is implemented with the help of Google’s Safe Browsing Update API v4. How it works: the URL’s hash prefix is checked against a local database of unsafe hash prefixes, and if a match is found, the full hash is requested from the Google Safe Browsing server and compared to the URL’s full hash. If the full hashes are equal, the URL is considered unsafe. Of course, all hash prefixes and full hashes are cached for a certain amount of time in order to minimize the number of requests sent to the server. Working only with URL hashes is also a big privacy win, since Google never learns the actual URLs that clients browse. The whole description of the API can be found here.
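To make the lookup step a bit more concrete, here is a minimal sketch of the client-side hashing, assuming the URL has already been canonicalized as the Safe Browsing specification requires; the function name is made up, and the 4-byte prefix is just the most common prefix length (the API allows prefixes of 4 to 32 bytes):

    #include <string.h>
    #include <glib.h>

    /* Compute the SHA-256 full hash of an already-canonicalized URL expression
     * and copy its first `prefix_len` bytes (typically 4) into `prefix`.
     * Returns the full 32-byte hash; the caller frees it with g_free(). */
    static guint8 *
    compute_hash_and_prefix (const char *canonical_url,
                             gsize       prefix_len,
                             guint8     *prefix)
    {
      GChecksum *checksum = g_checksum_new (G_CHECKSUM_SHA256);
      gsize full_len = g_checksum_type_get_length (G_CHECKSUM_SHA256);
      guint8 *full_hash = g_malloc (full_len);

      g_checksum_update (checksum, (const guchar *) canonical_url, -1);
      g_checksum_get_digest (checksum, full_hash, &full_len);
      g_checksum_free (checksum);

      /* The local database stores only prefixes; only on a prefix match is
       * the full hash sent to the server for confirmation. */
      memcpy (prefix, full_hash, MIN (prefix_len, full_len));

      return full_hash;
    }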

GUADEC 2017

This year’s GUADEC came a bit unexpectedly for me. I wasn’t really planning to attend it because of my school and work, but when Iulian suggested that we should go, I didn’t have to think twice and agreed immediately. And I was not disappointed! Travelling to Manchester proved to be a great vacation where I could not only enjoy a few days off but also learn things and meet new and old friends.

Much like last year’s GUADEC, I attended some of the talks during the core days, where I learned more about new technologies such as Flatpak, Meson and BuildStream (I’m really looking forward to seeing how this one turns out), as well as about GNOME’s history and future prospects.

One of this year’s social events was GNOME’s 20th anniversary party, held on Saturday night at the Museum of Science and Industry. I have to thank the organizing team for arranging such a great party and taking care of everything. This was definitely the highlight of this year’s edition!

As usual, I’ll say a few words about the host city, Manchester. I found Manchester a nice and cozy city, packed with everything: universities, museums, parks, and restaurants of all kinds for all tastes. The weather was not the best you can get, with rain and sunshine alternating on an hourly basis, but I guess that’s typical for the UK. Overall, I think Manchester is an interesting city where one would never get bored.

Thanks again to the GUADEC team and to GNOME for hosting such an awesome event!

[Image: “Sponsored by the GNOME Foundation” badge]

GUADEC 2016

One of the perks of being a GSoC student for GNOME is that you get invited to the annual GNOME Users And Developers European Conference. Therefore, I had the pleasure of travelling to Karlsruhe, Germany together with my fellow summer students and having a really amazing week.

Not only did I get the opportunity to meet my mentor, Michael Catanzaro, and other people from the community, but I also got to learn more about the whole GNOME stack and how to use it to its full power. All the presentations and talks I attended proved really enlightening!

On top of that, Karlsruhe is a beautiful city. I enjoyed all the pubs, restaurants and parks I had the pleasure of visiting, and I was especially impressed by the Karlsruhe Zoo and Karlsruhe Palace, both amazing places to visit.

I can easily say that this was the best part of the summer. Many thanks to GNOME for its sponsorship and I hope I’ll be able to attend GUADEC next year in Manchester too!

[Image: “Sponsored by the GNOME Foundation” badge]

GSoC 2016: Final report

Picking up where I left off in my previous post: shortly after GUADEC, I managed to implement the sync logic, which proved a bit tricky but worked out well in the end. Last week I asked my mentor, Michael, to review my code, so for the past few days I’ve been fixing the things he suggested in his review comments.

Since my code relies heavily on Mozilla’s protocols (and thus may appear a bit confusing to someone who is not already familiar with them), Michael suggested that I write some thorough documentation and comments for the important functions, so that will be my next step over the following days.

Hopefully, all of my work will land in master in the coming weeks, but only after Iulian finishes his work on bookmarks, since a relevant part of my code relies on the new bookmarks code.

Currently, bookmarks are the only items synced between different Epiphany instances, so future tasks include syncing the other important items too, such as history, passwords and open tabs, as well as enhancing the current code, since there is always room for improvement.

Google Summer of Code has ended now, and I want to thank GNOME for giving me the chance to be part of the community and do some great work, and also thank Michael and Iulian for supporting and guiding me throughout the whole summer. I hope that this is only the beginning of my involvement with GNOME, and that I get the opportunity to work with as many of you as possible in the future! 🙂


GSoC 2016: Progress #5

Like I said in my previous post, the final part of my project is the implementation of the actual Sync logic. This is done exclusively by sharing data with the Storage Server. Since Mozilla’s Storage Server does not support push notifications, this is going to require a bit of tinkering on my part in order to make sync work correctly.

What you need to know about the Storage Server is that it is essentially a dumb storage bucket – it only does what you tell it. Therefore, most of the complexity of Sync is the responsibility of the client. This is good for users’ data security, but can be bad for people implementing Sync clients 🙂

What else you need to know about the Storage Server is how it stores the data. The basic elements are simple objects called Basic Storage Objects (BSOs), which are organized into named collections. A BSO is the generic JSON wrapper around every item passed into and out of the Storage Server, and each BSO is assigned to a collection together with other related BSOs (e.g. the Bookmarks collection, the History collection, and so on).

Among other optional fields, every BSO contains the following mandatory fields:

  • id – an identifying string which must be unique within a collection.
  • payload – a string containing the data of the record.
  • modified – the timestamp at which the BSO was last modified, set by the server.
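As a rough illustration (a sketch, not the actual Epiphany code), a client-side BSO could be assembled with json-glib along these lines; the helper name is made up, and modified is omitted because the server sets it:

    #include <json-glib/json-glib.h>

    /* Build the JSON body for a single BSO upload. `payload` is the already
     * encrypted and encoded record (see the upload steps below). The caller
     * frees the returned string with g_free(). */
    static char *
    build_bso (const char *id,
               const char *payload)
    {
      JsonBuilder *builder = json_builder_new ();
      JsonGenerator *generator;
      JsonNode *root;
      char *json;

      json_builder_begin_object (builder);
      json_builder_set_member_name (builder, "id");
      json_builder_add_string_value (builder, id);
      json_builder_set_member_name (builder, "payload");
      json_builder_add_string_value (builder, payload);
      json_builder_end_object (builder);

      root = json_builder_get_root (builder);
      generator = json_generator_new ();
      json_generator_set_root (generator, root);
      json = json_generator_to_data (generator, NULL);

      json_node_unref (root);
      g_object_unref (generator);
      g_object_unref (builder);

      return json;
    }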

As for talking to the Storage Server, we just send HAWK-signed HTTP requests (GET, POST, DELETE and so on) to a given collection endpoint or to a given BSO endpoint.

Since I will only deal with bookmarks sync for the moment, I’ll continue to talk from the bookmarks’ point of view. Maybe now is a good time to mention that my current work depends heavily on Iulian’s work on the Bookmarks Subsystem Update, so I had to rebase my branch on top of his in order to work with the new bookmarks code.

Before a bookmark is uploaded to the server, there are a few steps that we need to take into consideration:

  1. Serialize. For an object to be serializable, it has to implement json-glib’s JsonSerializable interface, so that’s what we did for EphyBookmark too.
  2. Encrypt. To be more specific, this is an AES 256 encryption using the Sync Key retrieved from the FxA Server.
  3. URL safe encode. Since the BSO’s payload is a string that is sent over HTTP, we can’t send the raw encrypted bytes, so we have to base64 url-safe encode it.

Next, we create the BSO with the given id and payload, send it to the server, and set the modified value as returned by the server. Obviously, when downloading a BSO from the server, the steps go in the reverse order: decode -> decrypt -> deserialize.
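The encoding step itself is tiny; a possible helper, just a sketch using GLib’s standard base64 routine followed by the URL-safe character substitution (padding handling is left out here), could look like this:

    #include <glib.h>

    /* Standard base64 encode, then switch '+' and '/' to the URL-safe
     * alphabet ('-' and '_'). The caller frees the result with g_free(). */
    static char *
    base64_urlsafe_encode (const guint8 *data,
                           gsize         length)
    {
      char *encoded = g_base64_encode (data, length);
      char *p;

      for (p = encoded; *p != '\0'; p++) {
        if (*p == '+')
          *p = '-';
        else if (*p == '/')
          *p = '_';
      }

      return encoded;
    }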

OK, now that you know how to interact with the Storage Server, back to the Sync logic. I’m not sure if I’ll have time to finish implementing it before GUADEC; maybe I’ll do it there during one of the BoFs, who knows?

However, the actual Sync process should look something along these lines:

  • The user signs in.
  • Retrieve the Sync Key from the FxA Server.
  • Retrieve the storage endpoint and credentials from the Token Server.
  • Merge the local bookmarks with the remote ones from the Storage Server, if any.
  • Every time a bookmark is added or edited, (re)upload it to the server too.
  • Every time a bookmark is deleted, delete it from the server too.
  • Periodically check the server for changes to the Bookmarks collection and, if there are any, mirror them to the local instance (this is going to prove a bit tricky for deletions, since we can no longer track a BSO that has been deleted). A sketch of the periodic check follows this list.
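The scheduling part could be as simple as a GLib timeout source; the interval and the commented-out service call below are placeholders, not actual Epiphany API:

    #include <glib.h>

    #define SYNC_CHECK_INTERVAL_SECONDS (15 * 60)  /* example interval */

    /* Periodic callback: ask the Storage Server whether the Bookmarks
     * collection changed since the last recorded modified timestamp and,
     * if so, mirror the changes locally. */
    static gboolean
    sync_periodic_check (gpointer user_data)
    {
      /* e.g. ephy_sync_service_check_bookmarks (user_data);  (hypothetical) */
      return G_SOURCE_CONTINUE;  /* keep the timeout source alive */
    }

    static void
    sync_schedule_periodic_check (gpointer service)
    {
      g_timeout_add_seconds (SYNC_CHECK_INTERVAL_SECONDS,
                             sync_periodic_check,
                             service);
    }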

That’s it for the moment, see you at GUADEC!

GSoC 2016: Progress #4

In order to have a working form of sync with the help of the Mozilla servers, there are mainly three steps that need to be taken:

  1. Obtain a sessionToken and a keyFetchToken from the Firefox Accounts Server. These are automatically sent by the server upon sign in. The former allows us to obtain a signed certificate needed to talk to the Token Server, while the latter allows us to retrieve the sync keys needed to encrypt/decrypt synchronized data records.
  2. Obtain the storage endpoint, together with the storage credentials (id + key) from the Token Server. The storage endpoint represents the URL of the Storage Server that is assigned to the user upon the creation of the account. The storage credentials are used to sign all the HAWK requests sent to the Storage Server.
  3. Develop an algorithm based on multiple GET/PUT/POST/DELETE requests to the Storage Server to implement the actual sync logic. This should not only keep the data up to date on both the server and the remote clients, but also resolve any conflicts that may appear between clients due to concurrent requests.

As I’ve mentioned in my previous post, step #1 is complete, so over the last couple of weeks I have focused on steps #2 and #3. I’ll only talk about #2 now and leave #3 to be the subject of another post later this week.

OK, so in order to talk to the Token Server, one must possess a so-called signed BrowserID assertion. This is basically a signed certificate that Mozilla requires in order to convince subsequent relying parties that we control the account. To obtain one, we need to do the following:

  • Derive the tokenID from the sessionToken.
  • Generate a random RSA key pair.
  • Send the RSA public key, in a HAWK request signed with the tokenID, to the certificate/sign endpoint of the FxA Server.
  • Check if the certificate received from the server is valid (i.e. contains the correct uid and algorithm).
  • From the certificate and the RSA key pair, generate the BrowserID assertion for the URL of the Token Server.

Next, we put the previously computed BrowserID assertion in the Authorization header of the request that is sent to the Token Server. If everything is OK, the server will reply with the endpoint of the Storage Server, together with storage credentials that are valid for one hour. As stated before, these will later be used to sign all the requests to the Storage Server.
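In code, this boils down to a GET request carrying a “BrowserID <assertion>” Authorization header; a rough libsoup 2 sketch (the endpoint URL is Mozilla’s production Token Server, everything else – names, error handling – is illustrative) might look like:

    #include <libsoup/soup.h>

    /* Exchange a BrowserID assertion for Storage Server credentials.
     * Returns the raw JSON response (storage endpoint + id/key), or NULL. */
    static char *
    token_server_get_credentials (SoupSession *session,
                                  const char  *assertion)
    {
      SoupMessage *msg;
      char *authorization;
      char *response = NULL;

      msg = soup_message_new ("GET", "https://token.services.mozilla.com/1.0/sync/1.5");
      authorization = g_strdup_printf ("BrowserID %s", assertion);
      soup_message_headers_append (msg->request_headers, "Authorization", authorization);

      soup_session_send_message (session, msg);  /* synchronous, for brevity */

      if (msg->status_code == SOUP_STATUS_OK)
        response = g_strndup (msg->response_body->data, msg->response_body->length);

      g_free (authorization);
      g_object_unref (msg);

      return response;
    }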

I’ve only described brief parts of the algorithms, but I think this is just enough to get an idea of how the client-server communication works. Stay tuned for the next posts!

GSoC 2016: Progress #3

My last week has been quite busy, but it all paid off in the end, as I managed to overcome the issue I had with the login phase. Thankfully, I was able to take a look at how the postMessage() API is used to handle the login in Firefox for iOS and implement it myself in Epiphany.

To summarize it, this is how it’s done:

  1. Load the FxA iframe with the service=sync parameter in a WebKitWebView.
  2. Inject a few JavaScript lines to listen to FirefoxAccountsCommand events (sent by the FxA Server). This is done with a WebKitUserContentManager and a WebKitUserScript.
  3. In the event listener, use postMessage() to send back to WebKit the data received from the server.
  4. In the C code, register a script message handler with a callback that gets called whenever something is sent through the postMessage() channel. This is done with webkit_user_content_manager_register_script_message_handler().
  5. In the callback you now hold the server’s response to your request. This includes all the tokens you need to retrieve the sync keys.
  6. Profit!

Basically, postMessage() acts like a forwarder between JavaScript and WebKit. Cool!
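Here is a condensed sketch of steps 2–4; the handler name “toEpiphany” and the injected JavaScript are illustrative, and extracting the value via webkit_javascript_result_get_js_value() assumes WebKitGTK 2.22 or newer (older versions go through the JavaScriptCore C API instead):

    #include <webkit2/webkit2.h>

    /* Called whenever the injected script posts a message to the
     * "toEpiphany" handler, i.e. window.webkit.messageHandlers.toEpiphany. */
    static void
    on_fxa_message_received (WebKitUserContentManager *manager,
                             WebKitJavascriptResult   *result,
                             gpointer                  user_data)
    {
      JSCValue *value = webkit_javascript_result_get_js_value (result);
      char *json = jsc_value_to_string (value);

      /* The FirefoxAccountsCommand data (login tokens etc.) ends up here. */
      g_print ("FxA message: %s\n", json);
      g_free (json);
    }

    static WebKitWebView *
    create_fxa_web_view (void)
    {
      WebKitUserContentManager *manager = webkit_user_content_manager_new ();
      /* Illustrative listener: forward FirefoxAccountsCommand events to WebKit. */
      const char *source =
        "window.addEventListener('FirefoxAccountsCommand', function (event) {"
        "  window.webkit.messageHandlers.toEpiphany.postMessage(JSON.stringify(event.detail));"
        "});";
      WebKitUserScript *script;

      webkit_user_content_manager_register_script_message_handler (manager, "toEpiphany");
      g_signal_connect (manager, "script-message-received::toEpiphany",
                        G_CALLBACK (on_fxa_message_received), NULL);

      script = webkit_user_script_new (source,
                                       WEBKIT_USER_CONTENT_INJECT_TOP_FRAME,
                                       WEBKIT_USER_SCRIPT_INJECT_AT_DOCUMENT_END,
                                       NULL, NULL);
      webkit_user_content_manager_add_script (manager, script);
      webkit_user_script_unref (script);

      return WEBKIT_WEB_VIEW (webkit_web_view_new_with_user_content_manager (manager));
    }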

With this new sign-in method, users can also create new Firefox accounts. The iframe contains a “Create an account” link that shows a form through which users can create a new account. The account has to be verified before the user can sign in.

GSoC 2016: Progress #2

Unexpected things happen all the time and plans get derailed accordingly. That was my case too: since around July 1, the Firefox Accounts server started to block direct requests to the account/login endpoint. As you may imagine, this was (and still is) a real impediment to my project’s progress.

I had nothing to do but ask for answers on Mozilla’s sync-dev mailing list: it turns out the Firefox Accounts service is currently operating with tightened security rules due to an uptick in suspicious login attempts. Unfortunately, this also increases the likelihood of blocking legitimate login attempts, if they appear too similar to the suspicious traffic.

The guys at Mozilla suggested that I change my approach and use the JavaScript postMessage() API in order to log in, rather than the protocol I’m using now, since their intention is to deprecate this protocol for general public use in the near future. Therefore, my next step is to try to implement this new login method in Epiphany. This requires a good amount of research to determine whether it is feasible for Epiphany in the first place, so my project is getting delayed accordingly.

The login phase is a crucial part of the sync flow, since this is how you obtain the sessionToken and the keyFetchToken, which are used to further retrieve the sync keys that enable the client to encrypt and decrypt synchronized data records.

Since I’ve pretty much spent my time on mailing lists and digging through Firefox code, I wasn’t able to make noticeable progress in the last week. However, I managed to achieve two things:

  • implement a shortcuts window for Epiphany (during the downtime of my actual project). Shortcuts windows were introduced in GNOME 3.20 and provide a way for the user to see all the keyboard shortcuts for an application with their corresponding actions. Many applications had their shortcuts window ready before the 3.20 release, but Epiphany was not one of them.
  • implement the sync key fetching step. This means processing the keyFetchToken, which is received once the user logs in (I had to use a restmail email address to test this; apparently requests from test emails don’t get blocked by the server), and deriving the tokenID and reqHMACkey that are then used in the request to the account/keys endpoint. The server responds with a hex bundle that undergoes multiple cryptographic operations to yield the sync keys. This phase also gave me the opportunity to rethink some portions of my code and provide a cleaner boundary between the crypto module and the service module. A sketch of the token derivation follows this list.
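For the curious, the tokenID/reqHMACkey derivation is an HKDF-SHA256 expansion of the keyFetchToken. The sketch below implements RFC 5869 by hand on top of Nettle’s HMAC-SHA256 primitives; the empty (all-zero) salt and the info string reflect my reading of the onepw protocol, so treat them as assumptions rather than a definitive implementation:

    #include <string.h>
    #include <glib.h>
    #include <nettle/hmac.h>
    #include <nettle/sha2.h>

    /* HKDF-SHA256 (RFC 5869): extract-then-expand `output_len` bytes of
     * keying material from `input`, using an all-zero salt. */
    static void
    hkdf_sha256 (const guint8 *input,  gsize input_len,
                 const guint8 *info,   gsize info_len,
                 guint8       *output, gsize output_len)
    {
      struct hmac_sha256_ctx ctx;
      guint8 salt[SHA256_DIGEST_SIZE] = { 0 };
      guint8 prk[SHA256_DIGEST_SIZE];
      guint8 block[SHA256_DIGEST_SIZE];
      gsize done = 0;
      guint8 counter = 1;

      /* Extract: PRK = HMAC-SHA256(salt, input). */
      hmac_sha256_set_key (&ctx, sizeof salt, salt);
      hmac_sha256_update (&ctx, input_len, input);
      hmac_sha256_digest (&ctx, sizeof prk, prk);

      /* Expand: T(n) = HMAC-SHA256(PRK, T(n-1) || info || n). */
      while (done < output_len) {
        gsize chunk;

        hmac_sha256_set_key (&ctx, sizeof prk, prk);
        if (counter > 1)
          hmac_sha256_update (&ctx, sizeof block, block);
        hmac_sha256_update (&ctx, info_len, info);
        hmac_sha256_update (&ctx, 1, &counter);
        hmac_sha256_digest (&ctx, sizeof block, block);

        chunk = MIN (output_len - done, (gsize) sizeof block);
        memcpy (output + done, block, chunk);
        done += chunk;
        counter++;
      }
    }

    /* Split the derived material: tokenID | reqHMACkey | requestKey. */
    static void
    derive_key_fetch_tokens (const guint8 key_fetch_token[32],
                             guint8       token_id[32],
                             guint8       req_hmac_key[32],
                             guint8       request_key[32])
    {
      const char *info = "identity.mozilla.com/picl/v1/keyFetchToken";
      guint8 out[3 * SHA256_DIGEST_SIZE];

      hkdf_sha256 (key_fetch_token, 32,
                   (const guint8 *) info, strlen (info),
                   out, sizeof out);
      memcpy (token_id,     out,      32);
      memcpy (req_hmac_key, out + 32, 32);
      memcpy (request_key,  out + 64, 32);
    }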

Until I find a way to make login work again, there is not much I can do to advance my project. Hopefully, I’ll figure out how to implement the alternate login method soon, so I can proceed to establish the communication with the storage server.

GSoC 2016: Progress #1

This post may come a bit late, but I’ve been pretty much swamped by my end-of-semester finals over the last few weeks. However, exams are over now, so I can start focusing on my summer project. Accordingly, I’ve been working hard these last few days to catch up with my timeline before the midterm.

For those of you who are not aware of my project, I’m working on the Web: Session Sync project, which aims to introduce a bookmarks sync feature for Epiphany with the help of Mozilla’s servers. As I’ve mentioned in my previous post, the first part of my project involves establishing a working communication with the Firefox Accounts Server. This means implementing the client side of the onepw protocol in Epiphany, i.e. being able to compute the protocol’s tokens, send both normal requests and Hawk requests to the server, and derive the sync keys.

That being said, here are the things I’ve completed so far:

  • created different modules to divide my project’s functionality: a sync service (the core module, initiating every server-related action and invoking the other modules), a cryptographic module (for computing and deriving the protocol’s tokens and generating Hawk request headers), a secret module (for storing/retrieving encrypted tokens to/from disk – see below), and a utils module (for various utility functions).
  • implemented the PBKDF2 and HKDF algorithms used to derive the authentication tokens from the user’s email and password (a small usage sketch appears below). One handy tool for this task was Nettle, a low-level cryptographic library that proved very useful, with great APIs for HMAC, SHA-256, PBKDF2 and many others.
  • implemented functionality for keeping the protocol’s tokens secret by encrypting them on disk with the help of libsecret. The idea behind this is that most of the tokens are persistent, meaning they won’t expire until the user logs out and the session is destroyed. Hence, once the user has logged in, the Sync Service will store the computed/retrieved tokens so that they can be loaded directly from there on subsequent uses.
  • implemented functionality for generating headers for the Hawk requests. Hawk is an HTTP authentication scheme providing mechanisms for making authenticated HTTP requests with partial cryptographic verification of the request and response. This was totally new to me but, fortunately, I was able to peek at Mozilla’s Python Hawk library on GitHub and understand how to properly generate a Hawk header.
  • added a new Sync tab to the Preferences dialog. This is the place where users should be able to log in with their Firefox Account in order to start the syncing process. As soon as the Login button is pressed, the Sync Service will stretch the user’s email and password and issue a subsequent call to the account/login endpoint, sending the previously computed tokens to the FxA Server. If the server validates the request, the Sync Service will proceed to compute the sync keys from the server’s response (this part I have yet to implement); otherwise, an appropriate error message will be displayed.

[Screenshot: the new Sync tab in the Preferences dialog]
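Here is the promised Nettle usage sketch: a possible “quick stretch” of the password. This is only a sketch; the 1000 iterations and the salt format come from my reading of the onepw protocol, and the function name is made up:

    #include <string.h>
    #include <glib.h>
    #include <nettle/pbkdf2.h>

    /* PBKDF2-HMAC-SHA256 "quick stretch" of the user's password.
     * Assumed salt format: "identity.mozilla.com/picl/v1/quickStretch:" + email;
     * the result is a 32-byte quickStretchedPW. */
    static void
    quick_stretch_password (const char *email,
                            const char *password,
                            guint8      out[32])
    {
      char *salt = g_strdup_printf ("identity.mozilla.com/picl/v1/quickStretch:%s",
                                    email);

      pbkdf2_hmac_sha256 (strlen (password), (const uint8_t *) password,
                          1000,                       /* iterations */
                          strlen (salt), (const uint8_t *) salt,
                          32, out);

      g_free (salt);
    }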

That would be it for now. The next thing I’m planning to implement is the request to the account/keys endpoint along with the derivation of the sync keys from the response. I hope this post has shed some light on what I’m actually doing for my GSoC project!

See you in the coming weeks 🙂

GSoC 2016: Introduction

My name is Gabriel Ivașcu and I am a third-year student at University Politehnica of Bucharest, pursuing a Bachelor’s degree in Computer Science.

Ever since I joined university and was introduced to Linux, I have been an avid fan of free and open source software, and my passion has kept growing. I’ve been using GNOME on Fedora as my standard desktop environment for about two years now.

I started contributing to GNOME in October 2015 with the help of my colleague and friend Iulian Radu, a participant in the GSoC 2015 program and also a current participant in the 2016 edition. I worked mainly on Nibbles, Iulian’s 2015 project, and, later in February, we also began contributing to Web (a.k.a. Epiphany) under the guidance of Michael Catanzaro, our current mentor, who was kind enough to accept mentoring two projects this year.

Regarding my project, the ultimate goal is to introduce a session sync feature to GNOME Web. Session sync is a feature already present in many other popular web browsers, such as Firefox, Chrome and Safari, but one that Epiphany lacks. As stated on the ideas page, the aim is to provide a simple and easy way for users to synchronize their bookmarks, history, saved passwords and open tabs between multiple devices.

Choosing between Mozilla and OwnCloud, my mentor and I concluded that Mozilla is the better option because it would allow users to seamlessly sync data with Firefox using only a Firefox account, with no configuration required (OwnCloud would require users to set up their own server). However, Mozilla imposes a rather complex protocol that FxA clients need to use in order to communicate with the FxA server. Since implementing this protocol requires a considerable amount of work, and I probably won’t have enough time left to finish all forms of sync (bookmarks, history, passwords, open tabs), my mentor agreed that I should focus on getting the protocol working and only implement bookmarks sync for the moment. With a working form of bookmarks sync as a reference, adding the rest at a later time should be relatively easy.

Because my exam session runs between May 28 and June 17, I intend to start my work earlier, during the Community Bonding period, so I can make up for the time I will be unavailable during my exams.

I am very excited to be part of this and I hope good things will come out of it. Looking forward to the summer!