Today I Learned

A Hashrocket project

206 posts about #javascript

All Jest Describes Run First

When writing a Jest test, the setup code inside any top-level describe function gets run before any scenario. So, if you set up a world in one describe, and a competing world in another, the last one that runs wins.

This even applies when you’ve marked a describe as ‘skip’ via xdescribe. The code inside that setup will still be run.

This is a first pass at a TIL, because I’m not sure the behavior I’m seeing is expected, and if so, why it is expected. It certainly surprised me 😳. I think one solution is to limit your tests to one top-level describe which has the important benefit of being more readable, too.

Fix poor type support in immutablejs

Immutable.js doesn’t work very well with TypeScript. But I can patch the types myself for some perceived type safety:


import {List, Map} from 'immutable';

interface TypedMap<T> extends Map<any, any> {
  toJS(): T;
  get<K extends keyof T>(key: K, notSetValue?: T[K]): T[K];
}

interface User {
  name: string;
  points: number;
}

const users = List() as List<TypedMap<User>>;

users.forEach(u => {
  const aString = u.get('name')
  const aNumber = u.get('points')
});

Typescript Bang

The “non-null assertion operator” or “!” at the end of an operand is one way to tell the compiler you’re sure the thing is not null, nor undefined. In some situations, the compiler can’t determine things humans can.

For example, say we have a treasure map with some gold in it.

type TreasureMap = Map<string, number>

const map: TreasureMap = new Map()
const treasure = 'gold'
map.set(treasure, 10)

And we’re cranking out some code that tells us if the treasure is worth it:

function isItWorthIt(map: TreasureMap) { 
  return map.has(treasure) && map.get(treasure) > 9
}

Obviously we are checking if the map has gold first. If it doesn’t, the execution returns early with false. If it does, then we know we can safely access the value.

But the compiler gets really upset about this:

Object is possibly ‘undefined’. ts(2532)

In this case, the compiler doesn’t keep map.has(treasure) in context when evaluating map.get(treasure). There is more than one way to solve this, but for now we can simply “assert” our superiority over the compiler with a bang.

return map.has(treasure) && map.get(treasure)! > 9
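As the post says, there is more than one way to solve this. One alternative, sketched here in plain JavaScript since the runtime logic is identical, is to fall back to a default with nullish coalescing instead of asserting:

```javascript
// A treasure map like the one above.
const map = new Map();
map.set('gold', 10);

// Instead of asserting non-null, fall back to 0 when the key is missing:
// (map.get('gold') ?? 0) is 0 for an absent key, so the comparison is safe.
function isItWorthIt(map) {
  return (map.get('gold') ?? 0) > 9;
}

console.log(isItWorthIt(map));       // true
console.log(isItWorthIt(new Map())); // false
```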

TypeScript Docs - non-null assertion operator

Enzyme debug() 🐞

Debugging React unit tests can be tricky. Today I learned about the Enzyme debug() function. Here’s the signature:

.debug([options]) => String

This function:

Returns an HTML-like string of the wrapper for debugging purposes. Useful to print out to the console when tests are not passing when you expect them to.

Using this in a log statement will dump a ton of valuable data into your test runner’s output:

console.log(component.debug());

debug() docs

`isNaN` vs `Number.isNaN` (hint: use the latter)

Chalk this up to JavaScript is just weird. The isNaN function returns some surprising values:

> isNaN(NaN)
true

> isNaN({})
true

> isNaN('word')
true

> isNaN(true)
false

> isNaN([])
false

> isNaN(1)
false

What’s going on here? Per MDN, the value is first coerced to a number (like with Number(value)) and then if the result is NaN it returns true.
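That coercion rule can be checked directly: for every value, isNaN(v) agrees with Number.isNaN(Number(v)):

```javascript
// isNaN coerces first, so it matches Number.isNaN applied to Number(v).
const samples = [NaN, {}, 'word', true, [], 1];

for (const v of samples) {
  console.log(isNaN(v) === Number.isNaN(Number(v))); // true for every sample
}
```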

Number.isNaN is a little bit more literal:

> Number.isNaN(NaN)
true

> Number.isNaN({})
false

> Number.isNaN('word')
false

> Number.isNaN(true)
false

> Number.isNaN([])
false

> Number.isNaN(1)
false

So, if you really want to know if a value is NaN, use Number.isNaN.

I learned about this via Lydia Hallie’s JavaScript Questions.

Destructure into an existing array

This one’s got my head spinning. Let’s say you have an existing array:

const fruits = ['banana', 'apple', 'kumquat']

You can destructure right into this array. Note the wrapping parentheses, which keep the statement from being parsed as a block:

({name: fruits[fruits.length]} = {name: 'cherry'})

//fruits is now ['banana', 'apple', 'kumquat', 'cherry']

Generally, I would think of the {name: <some var id>} = ... syntax as renaming the value that you are destructuring, but now I think of it more as defining a location that the value will be destructured to.
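The same “location” idea works for object properties, not just array slots; a small sketch:

```javascript
const basket = { count: 0 };
const fruits = ['banana'];

// Destructure straight into an object property and an array slot.
({ n: basket.count, f: fruits[fruits.length] } = { n: 2, f: 'apple' });

console.log(basket); // { count: 2 }
console.log(fruits); // ['banana', 'apple']
```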

If you try to declare a new variable in the same destructuring, however, you will get an error if you use const:

const {name: fruits[fruits.length], color} = {name: 'cherry', color: 'red'}
// Uncaught SyntaxError: Identifier 'fruits' has already been declared

Or, if you don’t use const (and you’re in non-strict mode), the new variable will go onto the global or window object:

({name: fruits[fruits.length], color} = {name: 'cherry', color: 'red'})
global.color
// 'red'

Output directories in Parcel v1 and Parcel v2

Parcel stated a nice piece of philosophy in their v2 README.

Instead of pulling all that configuration into Parcel, we make use of their own configuration systems.

This shows up in the difference in how output directories are handled in v1 and v2.

In v1, dist is the default output directory and can be overridden with the -d flag for the build command, like this:

npx parcel build index.js
// writes to dist/index.js
npx parcel build index.js -d builds
// writes to builds/index.js

In v2, parcel reads the main key from your package.json file and uses that to configure the output path and file.

With the configuration:

// package.json
{
...
"main": "v2_builds/index.js"
...
}

parcel outputs to the specified location.

npx parcel build index.js
// writes to v2_builds/index.js

Production mode tree shaking in webpack by default

I’ve been experimenting with no-config webpack (version 4.43) recently and was pleased to see tree shaking is on by default.

If I have a module maths.js:

export const add = (a, b) => a + b;

export const subtract = (a, b) => a - b;

And in my index.js file I import only add:

import { add } from './maths'

Then when I run webpack in production mode with -p, displaying the used exports with --display-used-exports and the provided exports with --display-provided-exports, I get output for the maths module indicating that tree shaking is taking place:

$ npx webpack -p --display-used-exports --display-provided-exports
    | ./src/maths.js 78 bytes [built]
    |     [exports: add, subtract]
    |     [only some exports used: add]

The output [only some exports used: add] indicates that subtract has not been included in the final output.

Prefer lodash-es when using webpack

The lodash package needs to support all browsers, so it ships CommonJS modules. The lodash-es package ships ES modules, which webpack v4 can tree-shake by default.

This import declaration:

import {join} from 'lodash';

brings in the entire lodash file, all 70K.

This import declaration:

import {join} from 'lodash-es';

brings in just the join module, which is less than 1K.

With both lodash builds you can just import the function directly:

import join from 'lodash/join';

But when using multiple lodash functions in a file you may prefer the previous import declarations to get it down to one line:

import {chunk, join, sortBy} from 'lodash-es';

If you have these declarations throughout your app, consider aliasing lodash to lodash-es in your webpack config as a quick fix.

Include vs. Includes 🤷‍♂️

A hallmark of switching back and forth between Ruby and JavaScript: I mistype the respective array inclusion function a lot. In JavaScript, it’s includes, and in Ruby, it’s include (with a question mark). Each language does not care what I meant.

The way I remember which function to use: in Ruby, I say: “does this array include this item?”. Then when I’m writing JavaScript, I remember I’m not writing Ruby. This technique is… lacking.

Chris Erin’s mnemonic is better: “JavaScript has an ‘s’ in it, so it includes. Ruby does not, so it is include.”

AddEventListener doesn't duplicate named functions

When adding an event listener to a target in JavaScript, the EventTarget.addEventListener function lets you add a listener without clobbering any others registered on that target. This is really nice and all, but if you have a script that may be reloaded multiple times, this function can get called quite a bit. If you’re like me and you use anonymous functions like they’re going out of style, you can end up attaching multiple handlers, because they aren’t recognized as the same function, e.g.

const div = document.createElement('div');
div.addEventListener('click', event => fetch('mutate/thing'));

This could end up running the fetch the same number of times that the script is loaded. If instead you do:

function mutateMyThings(event) { fetch('mutate/thing') };
div.addEventListener('click', mutateMyThings);

Then you don’t have to worry about it, since adding a named function will automatically only add it once.
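The behavior is easy to verify outside the DOM, since recent Node (15+) ships the same spec-compliant EventTarget:

```javascript
// addEventListener ignores a re-registration of the same listener function.
const target = new EventTarget();

let calls = 0;
function handle() { calls += 1; }

// Registering the same named function twice only attaches it once…
target.addEventListener('ping', handle);
target.addEventListener('ping', handle);

// …while two anonymous functions with identical bodies are distinct listeners.
target.addEventListener('ping', () => { calls += 1; });
target.addEventListener('ping', () => { calls += 1; });

target.dispatchEvent(new Event('ping'));
console.log(calls); // 3: handle once, plus the two anonymous listeners
```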

Smooth animations with Math.sin()

Remember geometry class? Well, I don’t.

But I do know that when you’re making a synthesizer, a sine wave produces a nice smooth sound.

And as it turns out, a sine wave has a smooth curve. So, you can use Math.sin() to animate smooth motions like so:

const domNode = document.querySelector('#waveEmoji')
// Tune these for the effect: wave speed, rotation range (degrees), center angle.
const speed = 200
const range = 15
const offset = 0
function animate() {
  const a = Math.sin(Date.now() / speed) * range + offset;
  domNode.style = `transform: rotate(${a}deg);`;
  requestAnimationFrame(animate);
}
animate();


Variable arguments and map in JS can hurt

Given some array of numbers as strings:

const someNumStrings = ['1', '2', '05', '68'];

If you want them to be numbers, then you might be tempted to do something like:

someNumStrings.map(parseInt)

Which would be fine if parseInt didn’t allow multiple arguments, and if map only sent in the single element. But that’s not how it works, so what you end up getting is a bit of a mess.

[1, NaN, 0, NaN]

The parseInt function takes a radix as the second argument (and realistically anything else you want to pass to it won’t cause it to explode). The Array.map method takes a callback (in this case parseInt) and gives that little sucker all the data you could want! In this case, the index is being passed as the radix, and parseInt doesn’t care about the others.

TL;DR: map(el => parseInt(el)) >>>>>>>>>> map(parseInt) and if you ever intentionally encode the radix as the index of the element you’re parsing… may god have mercy on your soul.
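Spelled out with the array from the post, the safe version passes only the element (plus an explicit radix, which never hurts):

```javascript
const someNumStrings = ['1', '2', '05', '68'];

// map calls its callback with (element, index, array); wrapping parseInt
// in an arrow discards the extra arguments.
const safe = someNumStrings.map(el => parseInt(el, 10));
console.log(safe); // [1, 2, 5, 68]

// Passing parseInt directly lets the index leak in as the radix.
const messy = someNumStrings.map(parseInt);
console.log(messy); // [1, NaN, 0, NaN]
```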

Import Absolute Paths in Typescript Jest Tests

In order to avoid this:

// project/__tests__/stuff/someDistantCousin.test.ts
import { thing } from '../../src/stuff/someDistantCousin'
import { wrapFunction } from '../testUtils/firebase'

And to write this instead:

import { thing } from 'src/stuff/someDistantCousin'
import { wrapFunction } from 'tests/testUtils/firebase'

There are 2 things to configure:

  1. Jest Config (i.e. jest.config.js, package.json, etc…)
  2. TypeScript Config: tsconfig.json

jest.config.js

module.exports = {
  moduleNameMapper: {
    'src/(.*)': '<rootDir>/src/$1',
    'tests/(.*)': '<rootDir>/__tests__/$1',
  },
}

tsconfig.json

{
  "compilerOptions": {
    "baseUrl": "./",
    "paths": {
      "src/*": ["src/*"],
      "tests/*": ["__tests__/*"]
    }
  }
}

Do not prefix TypeScript interface names

TLDR: If you’re writing TypeScript, you’re using intellisense, and it will yell at you if you try to use an interface as a variable. So there is no need anymore for naming conventions like IMyInterface or AMyAbstractClass.

When I first started using TypeScript in 2018 I saw a lot of people’s code prefixing interfaces with “I”:

interface ICar {
    color: string
    horsepower: number
}

function drive(car: ICar) { /*...*/ }

However, that is a convention borrowed from the past (other languages). It used to be mentioned in the TypeScript handbook to not prefix, but that generated more argument than resolution. The handbook code examples, however, do not use prefixes.

I believe modern dev environments have relieved us from the need to include type and contextual information into the names of many code things, not just interfaces.

Now we get proper type and context from things like intellisense and language servers, whether it’s inferred or strictly typed. This frees us to have names for things that can be more descriptive of the processes and data in question.

Control your magic strings in Firebase projects

Firebase’s real-time database JavaScript API makes it really easy to wind up with a ton of magic strings in your codebase. Since you access data via “refs” like:

firebase.database.ref(`customers/1`).update({ name: 'Mike' })

In that example, “customers/1” is a magic string that points to some data in the database. Magic strings are well known as something to avoid in software development. And here we are updating a database with one, yikes!

These can easily be managed via patterns. I’ve been abstracting these into their own modules in an “API-like” pattern. For example:

// api/customers.ts
import { protectedRef } from 'protected-ref'

export const rootRef = 'customers'
export const getCustomerRef = (customerID: string) => protectedRef(rootRef, customerID)
export const updateCustomer = (customerID: string, updateDocument: CustomerUpdateDocument) => getCustomerRef(customerID).update(updateDocument)

And then use it like:

import { updateCustomer } from '../api/customers'

updateCustomer('1', { name: 'Mike' })

Also protected-ref is a firebase package that manifested from this TIL: https://til.hashrocket.com/posts/hyrbwius3s-protect-yourself-against-firebase-database-refs

Browsers have a Web Cryptography API

All major browsers implement a Web Cryptography API for obtaining random numbers:

const numbers = new Uint32Array(1);
window.crypto.getRandomValues(numbers);
console.log(numbers);
// => [1627205277]

However, the methods are overridable rather than read-only, which leaves them vulnerable to polyfill attacks. The API is still being developed and shouldn’t be relied on for serious cryptography yet, but I found out it exists.

JavaScript "falsy" values are still values

Having good defaults doesn’t mean always having them in the function signature. I try to provide good defaults for my function args when possible. I like to provide them in the signature, too. However, I ran into a case where doing it in an object destructure would have caused a bug.

This function receives an object (fruit) and returns its price based on its selectedRate. It is expected that selectedRate can be '', null, or undefined, in addition to constants like "standardRate". In any case that the selected rate doesn’t exist, standardRate should be used:

function getFruitPrice({ selectedRate = 'standardRate', ...fruit }) {
  //...
}

The code above doesn’t satisfy this requirement. JavaScript considers empty strings and null to be values even though they are falsy. So, we can’t rely on setting the default this way. Instead, a better implementation might look like this:

function getFruitPrice(fruit) {
  const rateToUse = fruit.selectedRate || 'standardRate'
  // ...
}
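A quick check of both versions (using hypothetical rate names) shows the difference:

```javascript
// Default-in-signature only fires when the property is undefined…
function rateWithDefault({ selectedRate = 'standardRate' }) {
  return selectedRate;
}

// …while || falls through for every falsy value ('', null, undefined, …).
function rateWithOr(fruit) {
  return fruit.selectedRate || 'standardRate';
}

console.log(rateWithDefault({ selectedRate: '' }));      // '' — the bug
console.log(rateWithOr({ selectedRate: '' }));           // 'standardRate'
console.log(rateWithOr({ selectedRate: null }));         // 'standardRate'
console.log(rateWithOr({ selectedRate: 'memberRate' })); // 'memberRate'
```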

Protect Yourself Against Firebase Database Refs!

My worst nightmare is to lose production data, thank God for backups. But my nightmare came to life on a project’s Firebase database. This is a noSQL JSON-like document database offered as a service from Google Cloud Platform. You use it like this:

firebase.database.ref('/customers/1')

This is how you get the data at that path. Like a file system.

And when you want to make it dynamic:

firebase.database.ref(`/customers/${customerID}`)

Cool.

But, what if customerID is blank?

cue Nightmare Before Christmas soundtrack

function handleDelete(customerID) {
    firebase.database.ref(`/customers/${customerID}`)
    // ...delete()
}

If I ran that function in production with a blank customer ID, well then I just deleted all the customers.

Fear Not!

import protectedRef from './protectedRef.js'

const customerID = ''
protectedRef('customers', customerID) 
// => [ERROR] Bad ref! 1: ""

Use protection kids.

// protectedRef.js

import { sanitize } from './sanitize.js'

function protectedRefReducer(pathSoFar, pathPart, index) {
  const sanitizedPathPart = sanitize(pathPart)

  if (!sanitizedPathPart) {
    throw new Error(`Bad ref! ${index}: "${pathPart}"`)
  }

  return pathSoFar + '/' + sanitizedPathPart
}

export default function protectedRef(...parts) {
  if (!parts.length) {
    throw new Error('noop')
  }
  return firebase.database().ref(parts.reduce(protectedRefReducer, ''))
}

Use Typescript to help migrate/upgrade code

I am tasked with migrating a large javascript codebase from using the beta version of firebase-functions to the latest. Like most major upgrades, there are many API changes to deal with. Here’s an example cloud function:

Beta version with node 6:

exports.dbCreate = functions.database.ref('/path').onCreate((event) => {
  const createdData = event.data.val(); // data that was created
});

Latest version with node 10:

exports.dbCreate = functions.database.ref('/path').onCreate((snap, context) => {
  const createdData = snap.val(); // data that was created
});

The parameters changed for onCreate.

In the real codebase there are hundreds of cloud functions like this and they all have varying API changes to be made. With no code-mod in sight, I’m on my own to figure out an efficient way to upgrade. Enter TypeScript.

After upgrading the dependencies to the latest versions I added a tsconfig:

{
  "compilerOptions": {
    "module": "commonjs",
    "outDir": "lib",
    "target": "es2017",
    "allowJs": true,
    "checkJs": true,
  },
  "include": ["src"]
}

The key is to enable checkJs so that the TypeScript compiler reports errors in JavaScript files.

And running tsc --noEmit against the codebase provided me with a list of 200+ compiler errors pointing me to every change required.

Use assigned variables on its own assign statement

JavaScript allows you to reference a variable being assigned within the same assignment statement. This was unexpected behavior for me, since the first languages I learned were compiled, and this would fail compilation in at least the ones I’ve worked with.

Here’s an example:

Javascript ES2015:

const [hello, world] = [
  () => `Hello ${world()}`,
  () => "World",
]
hello()
//=> "Hello World"

In this example I’m assigning the second function to the world variable via destructuring assignment, and using it by calling world() in the first function.
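Why does this work? Because world() is only called lazily, inside a closure that runs after the assignment completes. An eager reference on the right-hand side still fails (a sketch using the same shape):

```javascript
// Lazy: world is dereferenced only when hello() runs, after the
// destructuring assignment has initialized both bindings.
const [hello, world] = [
  () => `Hello ${world()}`,
  () => 'World',
]
console.log(hello()) // "Hello World"

// Eager: reading a binding inside its own initializer hits the temporal
// dead zone and throws a ReferenceError.
let threw = false
try {
  const [greet, name] = [`Hello ${name}`, 'World']
} catch (e) {
  threw = e instanceof ReferenceError
}
console.log(threw) // true
```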

Ruby:

To my surprise, Ruby allows the same behavior:

hello, world = [
  lambda { "Hello #{world.call}" },
  lambda { "World" },
]
hello.call
#=> "Hello World"

Elixir:

Well, Elixir will act as I previously expected, so assigned variables are allowed to be used only after the statement that creates them.

[hello, world] = [
  fn -> "Hello #{world.()}" end,
  fn -> "World" end,
]
hello.()
#=> ERROR** (CompileError) iex:2: undefined function world/0

I won’t start using this approach, but it’s important to know that’s possible.

Javascript arguments on ES2015 Arrow functions

In JavaScript, the arguments object can be accessed inside functions defined with the function keyword, such as:

function logArgsES5 () {
  console.log(arguments)
}
logArgsES5('foo', 'bar')
// => Arguments(2) ["foo", "bar"]

But ES2015 arrow functions do not bind their own arguments object, so if you try this you will see an error:

let logArgsES2015 = () => {
  console.log(arguments)
}
logArgsES2015('foo', 'bar')
// => Uncaught ReferenceError: arguments is not defined

So if we want a similar variable, we can declare a rest parameter named ...arguments:

let logArgsES2015 = (...arguments) => {
  console.log(arguments)
}
logArgsES2015('foo', 'bar')
// => Array(2) ["foo", "bar"]

FormData doesn't iterate over disabled inputs

If I have a form that looks like this:

<form>
  <input disabled name="first_name" />
  <input name="last_name" />
</form>

And then I use FormData to iterate over the inputs.

const form = new FormData(formElement);

const fieldNames = [];

form.forEach((value, name) => {
  fieldNames.push(name);
});

// fieldNames contains ['last_name']

Then fieldNames only contains one name!

Be careful if you’re using FormData to extract names from a form!

Set Splits Strings

Examples of JavaScript’s Set usually include an array as the argument:

const arraySet = new Set([1, 2, 3, 4, 5]);

However, the Set constructor also accepts a string, splitting it into a set for you:

const text = 'India';

const stringSet = new Set(text);  // Set ['I', 'n', 'd', 'i', 'a']

Enjoy those saved keystrokes!
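Since a Set keeps only unique members, this also makes for a quick way to get a string’s distinct characters (a small sketch):

```javascript
// The Set constructor iterates the string character by character,
// keeping only the first occurrence of each.
const letters = new Set('banana');
console.log([...letters]); // ['b', 'a', 'n']

// Handy for checks like "is this string made of unique characters?"
const isUnique = s => new Set(s).size === s.length;
console.log(isUnique('India'));  // true ('I' and 'i' are distinct)
console.log(isUnique('banana')); // false
```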

docs

Tuples are never inferred in TypeScript

If I have a function that takes a number tuple:

type Options = {
  aspect: [number, number],
  ...
}

const pickImage = (imageOptions: Options) => (
  ...
)

This will give a type error:

const myOptions = {
  aspect: [4, 3],
};

// ❌ Expected [number, number], got number[]
pickImage(myOptions);

I must use type assertion when passing tuples:

const myOptions = {
  aspect: [4, 3] as [number, number],
};

pickImage(myOptions);

Firefox Built-in JSON Tools ⛏

Recent versions of Firefox (I’m on 67.0.1) have built-in tools to explore JSON.

Here’s a screenshot of a JSON file from the React Native docs, viewed in Firefox:

image

We get syntax highlighting, the ability to save, copy, expand, or collapse the JSON, right in the browser, with no extra plugins or tools required. Another great Firefox feature!

Jest toEqual on js objects

Jest’s toEqual recursively compares all properties of an object using Object.is. And in JavaScript, an object returns undefined for missing keys:

const obj = {foo: "FOO"}

obj.foo
// "FOO"

obj.bar
// undefined

So it does not make sense to compare an object against another that has undefined as a value; remove those keys from the right side of the expectation to avoid confusion. In other words, this test will fail:

test("test fails => objects are similar", () => {
  expect({
    planet: "Mars"
  }).not.toEqual({
    planet: "Mars",
    humans: undefined
  });
});

// Expected value to not equal:
//   {"humans": undefined, "planet": "Mars"}
// Received:
//   {"planet": "Mars"}

Watch out!

Readable code with objects, constants & attributes

Occasionally when working with external APIs, you need to send over some cryptic status:

postData(
  `http://example.com/orders/123`,
  { status: 2 }
)

But what does 2 mean?!

We can use objects to make this a bit easier to maintain:

const STATUS = { processing: 0, error: 1, completed: 2 }

postData(
  `http://example.com/orders/123`,
  { status: STATUS.completed }
)

Now we can immediately see the order status is completed.
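One extra safeguard (not in the original snippet) is freezing the lookup so the codes can’t be changed at runtime:

```javascript
const STATUS = Object.freeze({ processing: 0, error: 1, completed: 2 });
console.log(Object.isFrozen(STATUS)); // true

// In strict mode a write throws a TypeError; otherwise it's silently ignored.
try { STATUS.completed = 99; } catch (e) { /* TypeError in strict mode */ }
console.log(STATUS.completed); // 2
```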


This technique is applicable in other languages as well:

Ruby with a constant:

STATUS = { error: 0, processing: 1, completed: 2 }
STATUS[:completed]

Elixir as a module attribute

defmodule Order do
  @status %{error: 0, processing: 1, completed: 2}

  def completed, do: @status.completed
end

Alphabetize Keys with jq

In a previous post, I wrote about how jq is great because it doesn’t alphabetize JSON keys by default. I still think that’s great, because sometimes the key order is meaningful, such as might be found in a package.json file.

We can add alphabetization to jq, however, using the -S flag. To format and sort your current buffer in Vim, run the following:

:%!jq -S '.'

Happy JSON-ing!

Prevent npm high level security errors in CI

In npm 6.6, a feature was added to provide security audit information for the packages that are used in your application.

This is run with:

npm audit

This exits with a non-zero exit code if any ‘low’, ‘medium’, ‘high’, or ‘critical’ errors were detected.

You can use that non-zero return code in your CI to fail a check, which should notify you of the security vulnerability which you can then resolve.

If you care about ‘high’ or ‘critical’ errors but don’t care about ‘low’ or ‘medium’, you can set the audit-level npm config value to ‘high’ in your npm configuration for your CI server.

Truncate an Array

Every time I go to truncate an array in JavaScript, I have to look up the syntax. It’s something that makes me angry.

Did you know that there is an oddly easier way to do it without reaching for splice() or slice()?

const collection = [4,6,9,1,12,42];
collection.length = 3;

collection is now [4,6,9]

Hopefully I can remember this one!

Get the return type of a function in TypeScript

Sometimes you only want to rely on the return type of a function instead of defining a new type for it.

To do that in TypeScript use ReturnType and typeof:

function extractStatusData(buffer: Buffer) {
  return {
    active: buffer[0] !== 1,
    totalErrors: buffer[1]
  }
}

function onStatusChange(callback: (data: ReturnType<typeof extractStatusData>) => void) {
  listenToSomeBinaryStuff(buffer => {
    const statusData = extractStatusData(buffer)
    callback(statusData)
  });
}


onStatusChange(({active, totalErrors}) => {
  if (active) {
    console.log(`Task is still running`)
  } else {
    console.log(`Task is completed with ${totalErrors} errors`)
  }
})

The paste event in browsers

Browsers allow you to capture a paste event in a DOM element [1][2].

This event fires before any clipboard data is inserted into the document, which makes it ideal for manipulating the data and pasting the manipulated data instead.

In Today I Learned by Hashrocket we recently utilized this feature to enable pasting images straight into the post editor. The image is then uploaded to imgur.com and the resulting URL is pasted as a Markdown Image Tag into the textbox.

image

If you are interested in adding similar functionality to your site check out this PR.

Also consider hosting your own fork of TIL.

Install node modules in subdir without a CD

So yea you don’t need a friend to send you a CD-ROM anymore to install node modules. You can use the World Wide Web! Pretty exciting.

On a more serious note, if your front-end is stored in a sub directory of your project, for example:

❯ tree -d -L 2
.
├── assets
|   ├── package.json
|   └── package-lock.json
└── app

Normally you would cd into assets, run npm i, and cd back out to the project root so you can go back to doing important things:

cd assets
npm i
cd -

Or you can push and pop, cause you’re kinda smart

pushd assets && npm i && popd

Or if you’re a really cool kid you’d use a sub-shell

(cd assets && npm i)

But wait! There’s another option - use the prefix CLI option in NPM!

npm i --prefix assets

It works really well for those long Dockerfile RUN statements.

for...in Iterates Over Object Properties

I don’t reach for for loops very often, so when I needed one recently I thought I’d check out the newer for...in construct. It didn’t behave quite how I was expecting. I thought it would iterate over the values in the target list, instead it seemed to be iterating over the indices.

The MDN docs explain what is going on:

The for...in statement iterates over all non-Symbol, enumerable properties of an object.

An array is an object whose properties are the indices of the values stored in the array.

const fruits = ["apple", "banana", "orange"];
for (let fruit in fruits) {
  console.log(fruit, fruits[fruit]);
}
// => "0" "apple"
// => "1" "banana"
// => "2" "orange"

The iteration value wasn’t what I was looking for, but in this case I can use it to access the value from the list. I’d be better off using a standard for loop though.
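If it’s the values you’re after, for...of (also an ES6 construct) iterates them directly:

```javascript
const fruits = ['apple', 'banana', 'orange'];

// for...of walks an iterable's values; for...in walks an object's keys.
const seen = [];
for (const fruit of fruits) {
  seen.push(fruit);
}
console.log(seen); // ['apple', 'banana', 'orange']
```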

Imports in ES5

The ES6 import keyword is so pervasive in how we write JavaScript currently that when I had to “import” the path package in ES5 I didn’t know how to do it!

It’s really easy though, and if you don’t like magic keywords maybe it’s a little more intuitive too.

const path = require('path');

And if something is the default export in its module, then you can use the default property.

const Something = require('something').default;

I ran into this in a file that was outside the build path, gatsby-config.js.

Destructured Access To Nested Value & Parent Value

A destructuring pattern that I often see (especially in React code) is to peel off a nested value in the argument declaration of a function.

const Component = ({ data: { name: displayName }}) => {
  return (
    <div>
      <h1>{displayName}</h1>
      <SubComponent />
    </div>
  );
};

On its own this works quite well, but what happens when you need access to the full set of data as well as the nested name value? I often see this.

const Component = ({ data }) => {
  const { name: displayName } = data;
  return (
    <div>
      <h1>{displayName}</h1>
      <SubComponent data={data} />
    </div>
  );
};

ES6 destructuring is flexible. You can skip the const line and keep everything in the argument declaration.

const Component = ({ data: { name: displayName }, data }) => {
  return (
    <div>
      <h1>{displayName}</h1>
      <SubComponent data={data} />
    </div>
  );
};

You can re-reference the data value after the nested destructuring.
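Stripped of the JSX, the same trick works in plain JavaScript (with a hypothetical present helper):

```javascript
// Destructure a nested value and keep a reference to the parent object.
const present = ({ data: { name: displayName }, data }) =>
  `${displayName} (${Object.keys(data).length} fields)`;

console.log(present({ data: { name: 'Ada', role: 'admin' } }));
// "Ada (2 fields)"
```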

Matching A Computed Property In Function Args

The computed property name feature of ES6 allows you to reference a variable in object assignments and destructurings. This syntax is flexible enough that it can be used in the arguments portion of a function declaration. In fact, it can even be matched against another argument — allowing the creation of some handy, yet terse functions.

const get = (key, { [key]: foundValue }) => foundValue;

Notice that the first argument, key, will match against the computed property name in the second argument. The foundValue will correspond to whatever key maps to in the given object.

This get function can then be used like so.

const stuff = { a: 1, b: 2, c: 3 };

console.log("Get a:", get("a", stuff)); // Get a: 1
console.log("Get d:", get("d", stuff)); // Get d: undefined

h/t @sharifsbeat