Today I Learned

A Hashrocket project

How to convert JSON to CSV with jq

I had a JSON file that was an array of objects, and some of those objects had different keys. I wanted to visualize that data in a spreadsheet (data science, AI, machine learning stuff), so I decided to build a CSV file where each JSON key becomes a CSV column.

// file.json
[
  {
    "type": "Event1",
    "time": 20
  },
  {
    "type": "Event2",
    "distance": 100
  }
]

jq to the rescue:


cat file.json | jq -r '(map(keys) | add | unique) as $cols | map(. as $row | $cols | map($row[.])) as $rows | $cols, $rows[] | @csv' > file.csv

Here is the output:

"distance","time","type"
,20,"Event1"
100,,"Event2"
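
For readability, here is the same filter split across lines (jq accepts # comments); it is only a reformatting of the one-liner above:

(map(keys) | add | unique) as $cols                # union of every key across objects, sorted
| map(. as $row | $cols | map($row[.])) as $rows   # each object becomes a row in column order (missing keys become null)
| $cols, $rows[]                                   # emit the header row, then each data row
| @csv                                             # render as CSV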

TypeScript Bang

The “non-null assertion operator”, a “!” placed after an operand, is one way to tell the compiler you’re sure the thing is neither null nor undefined. In some situations, the compiler can’t determine things humans can.

For example, say we have a treasure map with some gold in it.

type TreasureMap = Map<string, number>

const map: TreasureMap = new Map()
const treasure = 'gold'
map.set(treasure, 10)

And we’re cranking out some code that tells us if the treasure is worth going after:

function isItWorthIt(map: TreasureMap) { 
  return map.has(treasure) && map.get(treasure) > 9
}

Obviously we are checking if the map has gold first. If it doesn’t, the execution returns early with false. If it does, then we know we can safely access the value.

But the compiler gets really upset about this:

Object is possibly ‘undefined’. ts(2532)

In this case, the compiler doesn’t keep map.has(treasure) in context when evaluating map.get(treasure). There is more than one way to solve this, but for now we can simply “assert” our superiority over the compiler with a bang.

return map.has(treasure) && map.get(treasure)! > 9
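
The post mentions there is more than one way to solve this; one such alternative (my sketch, not from the post) is to pull the value into a local so the compiler can narrow away the undefined:

function isItWorthIt(map: TreasureMap) {
  const gold = map.get(treasure) // number | undefined
  return gold !== undefined && gold > 9
}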

TypeScript Docs - non-null-assertion-operator

Count Occurrences in Elixir

Elixir recently introduced a useful pair of functions to count how many times a value appears in an Enumerable. They come in two flavors: frequencies/1 and frequencies_by/2. Here’s an example:

iex> [
...>   %{name: "Falcon", power: "Flight"},
...>   %{name: "Titan Spirit", power: "Flight"},
...>   %{name: "Atom Claw", power: "Strength"},
...>   %{name: "Electro", power: "Electricity Control"},
...>   %{name: "Loki Brain", power: "Telekinesis"},
...> ] |> Enum.frequencies_by(& &1.power)
%{
  "Electricity Control" => 1,
  "Flight" => 2,
  "Strength" => 1,
  "Telekinesis" => 1
}
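
For completeness, here’s a quick sketch of the plain frequencies/1 variant, which counts the values themselves (made-up data):

iex> ["Flight", "Flight", "Strength", "Telekinesis"] |> Enum.frequencies()
%{"Flight" => 2, "Strength" => 1, "Telekinesis" => 1}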

Generic React Components

When props are generic like this:

interface SelectProps<T> {
  options: T[];
  onChange?: (value: T) => void;
}

function CoolSelect<T> (props: SelectProps<T>) {
    // ...
}

The generic part can be specified in JSX like this:

interface Fruit {
  name: string;
  isFruit: boolean;
}

const fruits = [
  { name: 'Pumpkin', isFruit: true },
  { name: 'Avocado', isFruit: true },
  { name: 'Cucumber', isFruit: true },
  { name: 'Bell Pepper', isFruit: true },
]

function App() {
  return <CoolSelect<Fruit> options={fruits} />
}

See it? <CoolSelect<Fruit> options={fruits} />

Now when crafting the onChange function in this example, its type will be inferred as this:

type OnChange = (value: Fruit) => void
function App() {
  return (
    <CoolSelect<Fruit>
      options={fruits}
      onChange={value => {
        if (value.isFruit && value.name === 'Bell Pepper') {
          console.log("You're blowing my mind dude!")
        }
      }}
    />
  )
}

*This syntax is available in TypeScript v2.9+

Git + NPM: Resolving Lockfile Conflicts 🤝

Here’s a challenging real-world scenario: you’re doing a big merge or rebase on a JavaScript project, and you keep getting conflict after conflict in your package-lock.json. These conflicts are tough to resolve, because your package-lock.json is not easy to read, and is, say, 30,000 lines long. What do you do?

When you hit a conflict, on the conflicting Git SHA, run the following command. Your lockfile will be regenerated, conflict resolved:

$ npm install --package-lock-only

💥

See the npm docs on resolving lockfile conflicts for more detail.


Tmux Send Keys to Pane

I wrote a script the other day designed to help me download and edit files faster. In part of the script, I wanted to open Vim in an existing Tmux pane, and in the process I learned about the tmux send-keys command. Here’s how it works:

$ tmux send-keys -t 3 "vim" Enter

send-keys, aliased send, sends your string of commands to the pane of your choice (-t). Running the above opens Vim in pane #3.

The -t flag accepts negative numbers, too, like an array index. In my version of the above command, I send the keys to pane -1, the last pane on the screen, which is where I keep Vim in my Tmux session.
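
Based on that description, the negative-index version of the command above would look something like this:

$ tmux send-keys -t -1 "vim" Enter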

Use Enzyme's `wrappingComponent` option in `mount`

Also works with shallow:

const provided = {super: 'cool', object: ['of', 'things']};

// This means that the root of the tree is going to be the provider
describe('Some Component', () => {
  it('does cool things when props change', () => {
    const target = mount(
      <CoolProvider thing={provided}>
        <Component changeMe={false} />
      </CoolProvider>
    )
  })
})

// This means that the root of the tree will be your Component
describe('Some Component', () => {
  it('does cool things when props change', () => {
    const target = mount(<Component changeMe={false} />, {
      wrappingComponent: CoolProvider,
      wrappingComponentProps: {super: 'cool', object: ['of', 'things']}
    })
  })
})

This pattern is particularly meaningful when you want to call setProps or other methods that are only valid on the root of the component tree, because dive can’t help you there. If you want to change the props on both, you can use target.getWrappingComponent() to get at the wrapping component in the same way!
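
Here’s a rough sketch of that flow, reusing the CoolProvider example from above (my sketch, not lifted from the Enzyme docs):

const target = mount(<Component changeMe={false} />, {
  wrappingComponent: CoolProvider,
  wrappingComponentProps: {super: 'cool', object: ['of', 'things']}
})

// setProps works here because Component is the root of this tree
target.setProps({changeMe: true})

// getWrappingComponent() returns a wrapper around CoolProvider,
// so its props can be changed the same way
target.getWrappingComponent().setProps({super: 'uncool'})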

https://enzymejs.github.io/enzyme/docs/api/ReactWrapper/getWrappingComponent.html

Enzyme debug() 🐞

Debugging React unit tests can be tricky. Today I learned about the Enzyme debug() function. Here’s the signature:

.debug([options]) => String

This function:

Returns an HTML-like string of the wrapper for debugging purposes. Useful to print out to the console when tests are not passing when you expect them to.

Using this in a log statement will dump a ton of valuable data into your test runner’s output:

console.log(component.debug());

debug() docs

Postgres Identity Column

The Postgres wiki recommends not using the serial type; Postgres 10 added identity columns to replace it.

Old way:

create table todos (
  id bigserial primary key,
  todo text not null
);

The new way with identity columns:

create table todos (
  id bigint generated by default as identity primary key,
  todo text not null
);

Data:

insert into todos (todo) values
  ('write a til'),
  ('get some coffee');

select *
from todos;
 id |      todo
----+-----------------
  1 | write a til
  2 | get some coffee
(2 rows)

Source: PG wiki: Don’t use serial

Turning off a specific linter w/ALE

ALE comes configured with a set of default linters for each filetype it might encounter.

For TypeScript, if eslint is available as an executable, ALE will run it, lint your code, and display the results in Vim. To turn off eslint for TypeScript, you can set the variable g:ale_linters_ignore in your vimrc like this:

let g:ale_linters_ignore = {
      \   'typescript': ['eslint'],
      \}

Currently, I’m going through the TypeScript Exercism track; I want to play around a little with the syntax and would prefer not to have a TypeScript linter at the moment.

Give git config more context with `--show-scope`

git config is only as helpful as the options you pass.

In the simplest instance I only get a value:

> git config user.email
dev@example.com

If I pass the --get-regexp flag I get the key and the value for all the instances of that key:

> git config --get-regexp user.email
user.email computer@example.com
user.email dev@example.com

If I pass the --show-scope flag (added in 2.26) I get the scope:

> git config --show-scope --get-regexp user.email
global user.email computer@example.com
local  user.email dev@example.com

If I pass --show-origin, then I also get the file where the key was configured:

> git config --show-scope --show-origin --get-regexp user.email
global file:/home/dev/.gitconfig user.email computer@example.com
local  file:.git/config user.email dev@example.com

The git blog is an incredible way to learn about the new functionality in each git release.

`isNaN` vs `Number.isNaN` (hint: use the latter)

Chalk this up to “JavaScript is just weird.” The isNaN function returns some surprising values:

> isNaN(NaN)
true

> isNaN({})
true

> isNaN('word')
true

> isNaN(true)
false

> isNaN([])
false

> isNaN(1)
false

What’s going on here? Per MDN, the value is first coerced to a number (like with Number(value)) and then if the result is NaN it returns true.
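
That coercion is exactly what produces the surprising results above:

> Number({})
NaN

> Number('word')
NaN

> Number(true)
1

> Number([])
0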

Number.isNaN is a little bit more literal:

> Number.isNaN(NaN)
true

> Number.isNaN({})
false

> Number.isNaN('word')
false

> Number.isNaN(true)
false

> Number.isNaN([])
false

> Number.isNaN(1)
false

So, if you really want to know if a value is NaN, use Number.isNaN.

I learned about this via Lydia Hallie’s JavaScript Questions.

Throttle iOS simulators' network speed

Get macOS’s Network Link Conditioner.prefPane.

  1. Sign in at apple: https://developer.apple.com/download/more/?=Additional%20Tools%20for%20Xcode
  2. Download “Additional Tools for Xcode [version]” for your Xcode version.
  3. Run the DMG
  4. Open Hardware and double-click Network Link Conditioner.prefPane to install.

Use it:

  1. Open Settings then Network Link Conditioner
  2. Adjust the Profile and toggle the switch ON

Remember to turn it off :)

Destructure into an existing array

This one’s got my head spinning. Let’s say you have an existing array:

const fruits = ['banana', 'apple', 'kumquat']

You can destructure right into this array.

({name: fruits[fruits.length]} = {name: 'cherry'})

// fruits is now ['banana', 'apple', 'kumquat', 'cherry']

Generally, I would think of the {name: <some var id>} = ... syntax as renaming the value that you are destructuring, but now I think of it more as defining a location that the value will be destructured to.

If you try to declare a new variable in the same destructuring, however, you will get an error if you use const:

const {name: fruits[fruits.length], color} = {name: 'cherry', color: 'red'}
// Uncaught SyntaxError: Identifier 'fruits' has already been declared

Or the new variable will go onto the global or window object if you don’t use const:

({name: fruits[fruits.length], color} = {name: 'cherry', color: 'red'})
global.color
// 'red'

Check for members in Ruby

Ruby’s Enumerable class has a member? method that returns a boolean.

For arrays, the method checks if the supplied value is included (similar to ['a'].include?('a')):

[:a, :b].member?(:b) # => true
[:a, :b].member?(:c) # => false

For hashes, the method checks if the supplied value is included in the keys for the hash:

{ a: 'b' }.member?(:a) # => true
{ a: 'b' }.member?(:c) # => false

Supercharge Your Script with psql -c 🥞

Want to execute a PostgreSQL command from the command line? You can! The --command or -c flag takes a string argument that will be executed on your database of choice.

I’ve been using it as part of a script that creates a remote database backup, downloads the backup, drops and creates a local database, dumps the database backup into the local database, and then runs a select statement on the dataset. That final command looks like this (query has been simplified):

$ psql -d tilex_prod_backup -c "select count(*) from posts;"

 count
-------
  2311
(1 row)

Repeat subs with `g&` and `:&` and `:&&`

The tricks from Vim Un-Alphabet keep coming. Repeating your substitution is a cool trick and you can do it one of 3 ways.

g& will repeat the last substitution you did, but for the whole file, whatever file you’re in at the moment.

:& will repeat the last substitution on the line you are on but you get to change the flags if you want. So you can decide now to make it global with :&g.

:&& will repeat the last substitution on the line you are on with the flags you used for that substitution.

3 very nice tricks to smooth out your workflow when making substitutions.
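
A minimal cheat-sheet, using a made-up typo fix:

:s/teh/the/   substitute on the current line
g&            repeat that substitution over the whole file
:&&           repeat it on the current line, keeping the original flags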

Inline MD codeblock with double backticks

A pair of single backticks signifies an inline code block in Markdown. But if you need to put a backtick in that code block, the backtick ends the code block!

Another way to signify an inline codeblock is with double backticks, two backticks in sequence on either side of the codeblock.

When using double backticks for a codeblock, a single backtick within the codeblock will be interpreted as just that, not the end of the codeblock.
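
For example, in this made-up line the inner backticks around git show up literally inside the code span instead of ending it:

``the `git` command``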

Jump to the last place you were in INSERT mode

I’ve known for a while that you can jump to the last place you edited with the gi command but it’s always been slightly annoying to me that gi places you into INSERT mode.

To get back to the same place, but in NORMAL mode, you can use the ^ mark by typing `^. This mark is reset every time you leave INSERT mode. You can see what that mark is set to with :marks ^. Shoutout to Josh Branchaud and his Vim Un-Alphabet series for teaching me a new Vim trick!

zsh comes with help (set the $HELPDIR)

zsh helpfully comes installed with help files for all the builtins and a run-help command to help you access those help files. There is a trick, though. Before setting any environment variables, here’s what happens:

$ run-help
There is no list of special help topics available at this time.

This is because the HELPDIR isn’t set. You have to find the install location for zsh’s help files and set the env var to that dir. On my system that looks like this:

export HELPDIR='/usr/share/zsh/help'

Then when you run run-help you should see a list of builtins for which there is help documentation. This is the same documentation that you can get via man builtins but much more readable and discoverable. run-help will call man as well if it can’t find your arg in the help files.

For me run-help is cumbersome to type so I alias it. Here’s what goes into my .zshrc:

export HELPDIR='/usr/share/zsh/help'
alias help=run-help

zsh is now much more helpful!

zsh comes with Tetris

zsh comes with its very own Tetris game. No plugins needed!

You do need to autoload the tetriscurses function:

autoload -Uz tetriscurses

And then run tetriscurses.

While in the game, you can press H to learn which keys do what, and that looks like this:

left: h, j, left
right: right, n, l
rotate: up, c, i
soft drop: down, t, k
hard drop: space
quit: q
press space to return

Also, maybe you want an alias for this?

autoload -Uz tetriscurses
alias tetris=tetriscurses

I’m putting the above straight into my .zshrc! Happy Sunday!

Follow the link in linux with `readlink -e`

Sometimes Linux can be a maze of symbolic links. On my system, the java command exists at /usr/bin/java, which is a link that points to /etc/alternatives/java, which is a link that points to /usr/lib/jvm/java-8-oracle/jre/bin/java.

Instead of looking up each of these links with ls -l, readlink -e will follow the links all the way through to the eventual file. In my case that would look like this:

$ readlink -e `which java`
# returns /usr/lib/jvm/java-8-oracle/jre/bin/java

You can learn more with man readlink.

On Mac, there is a readlink command, but there is no -e flag and it is not recursive.

Elixir Date and Time conversion into US standards

Today I learned how to manually convert a Date and Time into US standards: mm/dd/YYYY and hh:mm am|pm. Here’s my code snippet:

defmodule Utils.Converter do
  def to_usa_date(%Date{day: day, month: month, year: year}) do
    "~2..0B/~2..0B/~4..0B"
    |> :io_lib.format([month, day, year])
    |> to_string()
  end

  def to_usa_time(%Time{} = time) do
    period = (time.hour < 12 && "am") || "pm"
    hour = time.hour |> rem(12)

    "~2..0B:~2..0B ~2..0s"
    |> :io_lib.format([hour, time.minute, period])
    |> to_string()
  end
end

This way I can convert dates like ~D[2020-05-29] into "05/29/2020" and times like ~T[11:00:07.001] into "11:00 am" and ~T[23:00:07.001] into "11:00 pm".

Here’s the Erlang :io_lib documentation

Choosing your `--cloud` provider on Gigalixir

One cool thing about gigalixir is that you can choose both the cloud platform and the datacenter/region where you’d like to install your app.

By default gigalixir create <app name> will put an app in Google Cloud Platform.

But what if you already have infrastructure that you want to take advantage of on AWS in the us-east-1 region (this is where Heroku puts its servers by default)?

Well, you can change providers with the --cloud flag and the region with the --region flag.

gigalixir create -n gigatilex --cloud aws --region us-east-1

AWS it is!

Transfer env vars from Heroku to Gigalixir

We’re in the process of moving tilex, this website, to gigalixir so that we can use http/2. One part of that is moving over all the configuration. Turns out that’s real easy with the -s flag.

heroku config -s -a tilex | xargs gigalixir config:set

With the -s flag, you get this:

> heroku config -s | grep DATABASE_URL
DATABASE_URL=postgres://user:pass@location/dbname

Rather than this

> heroku config | grep DATABASE_URL
DATABASE_URL: postgres://user:pass@location/dbname

Which means you can just pipe all those configurations over to gigalixir using xargs.

The heroku command assumes there is a git remote named heroku and the gigalixir command assumes there is a git remote named gigalixir.

Global Alias in zsh

What makes an alias global? Well, the -g flag of course. And what does this globality give you? Well, the ability to invoke an alias anywhere in the command line.

If I like the word ‘Potatoes’ but I don’t ever have the energy to type the whole thing, then I can create a global alias for that word:

> alias -g PO="Potatoes"
> echo PO
Potatoes

That’s convenient and cool. What is it actually for? Maybe redirecting errors to /dev/null:

> alias -g NO='2> /dev/null'
> echo foo >> /dev/stderr
foo
> (echo foo >> /dev/stderr) NO
# no output, it got swallowed!

Looks weird and maybe not useful, but perhaps you can find a creative way to use it.

I learned about this and other zsh functionality here.

Shortcuts with hash -d in zsh

I stumbled across this zsh tricks post yesterday and am blown away by the hash command, which allows you to see and manipulate the hash table for either commands or for directory shortcuts.

hash by itself in zsh will output the location for all the commands.

hash -d shows you all of the named directories, and, check this out, you can navigate to one of those directories with ~shortcutname, like this:

$ hash -d | grep bin
bin=/bin
daemon=/usr/sbin
proxy=/bin
sync=/bin
$ cd ~daemon
$ pwd
/usr/sbin

You can create your own directory shortcuts like this:

$ hash -d mydir=/home/me/very/long/path

And then cd to it:

$ cd ~mydir
$ pwd
/home/me/very/long/path

Crazy! Read more in the zsh docs.

List Files by Updated

I’m currently working on an app that forwards logging around to various locations on the Linux server. It’s a bit tricky for me to figure out where any action I take in the browser is being logged. I need those logs!

A nice way to figure out where the logging is happening is to narrow it down to one directory (say, /var/log/) then ls that directory, ordering by most recently updated. The items at the top of the list have been recently updated, and thus probably contain valuable loggings!

$ ls -lt

I’m throwing on the -l flag for more detail. If there are a lot of logs, filter the output down with head:

$ ls -lt | head

Thanks for the idea, Kori!

Rails `travel_to` changes the value of `usec` to 0

https://github.com/rails/rails/blob/4dcc5435e9569e084f6f90fcea6e7c37d7bd2b4d/activesupport/lib/active_support/testing/time_helpers.rb#L145

In and of itself, not a huge deal. However, when you combine it with the way that Active Record quotes times:

https://github.com/rails/rails/blob/4dcc5435e9569e084f6f90fcea6e7c37d7bd2b4d/activesupport/lib/active_support/testing/time_helpers.rb#L145

It can lead to some incredibly subtle bugs in tests that use queries in their assertions. Long story short, if your query relies on a time comparison, it may not return anything. :|
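
A minimal sketch of the truncation, assuming a test that includes ActiveSupport::Testing::TimeHelpers:

real_now = Time.current
real_now.usec       # most likely nonzero

travel_to(real_now) do
  Time.current.usec # => 0, travel_to dropped the microseconds
end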

Base64 Encode Your Images

This is a companion to my previous post about inline HTML styles. If you want to put your images inline in your HTML, GNU coreutils provides the base64 program.

Here’s me running it in Vim command mode, sending the output right to my current line:

:.! base64 -w 0 public/images/bg.gif

The -w 0 disables line wrapping, which I don’t want for inline images. Prefix this string with "data:image/<MIME TYPE>;base64," and you’re good to go.
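
Dropped into an img tag, the result looks something like this (hypothetical, truncated base64):

<img src="data:image/gif;base64,R0lGODlhAQABAIAAA..." alt="">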

What Styles are Being Used?

I like my exceptional pages (404, 500, 503) to rely as little as possible on the webserver to render in the browser. A nice tool to support this is CSS Used for Chrome.

CSS Used adds a panel to DevTools that reports which CSS classes from external stylesheets are being used on your page. You can then paste these styles inside a style tag and remove stylesheet references in your HTML. I iterate on this by opening the HTML file in my browser outside of my project repo; when all the styles are loading, I’ve removed the stylesheet dependency.


Word navigation when underscores are in the word

Phil Capel posted a til recently that talked about using the _ character as a word boundary by removing it from the iskeyword list with:

:se iskeyword-=_

So now w navigates to the next underscore in long_id_for_var and you can copy long with yiw when your cursor is on long.

My addendum to this is that navigating with W will still go to the next space-separated word, B will go to the beginning of the previous space-separated word, and if your cursor is on long, yiW will copy long_id_for_var.

Use the word under the cursor with Ctrl-R Ctrl-A

Everybody at Hashrocket has some solution for searching for the word under the cursor.

Some people created a mapping, but as I try to keep to native vim functionality as much as possible I copied the current word with yiw and then typed:

:Rg <C-R>0

Where <C-R>0 writes whatever is in register 0 to the command.

Instead, the command-line key sequence <C-R><C-A> writes the word currently under the cursor to the command, so I can skip the yiw.

:Rg <C-R><C-A>

Will search for the word under the cursor.

See :help c_CTRL-R_CTRL-A for more info.

The three amigos of the current directory

I always have trouble remembering how to get the name of the current directory. So strange mnemonics are the way to go.

The first amigo is a shell variable:

echo $PWD
# returns '/home/chris/tils'

There is also a pwd command that returns the same thing.

The second amigo is basename which gives you the current directory name without its path:

basename $PWD
# returns 'tils'

The third amigo is dirname which gives you the path without the current directory name:

dirname $PWD
# returns '/home/chris'

So now I can do things like

alias tnew='tmux new -s $(basename $PWD)'

because I always, always, name my tmux session after the name of the current directory.

Output directories in Parcel v1 and Parcel v2

Parcel stated a nice piece of philosophy in their v2 README.

Instead of pulling all that configuration into Parcel, we make use of their own configuration systems.

This shows up in the difference in how output directories are handled in v1 and v2.

In v1, dist is the default output directory and can be overridden with the -d flag for the build command, like this:

npx parcel build index.js
// writes to dist/index.js
npx parcel build index.js -d builds
// writes to builds/index.js

In v2, parcel reads the main key from your package.json file and uses that to configure the output path and file.

With the configuration:

// package.json
{
...
"main": "v2_builds/index.js"
...
}

parcel outputs to the specified location.

npx parcel build index.js
// writes to v2_builds/index.js

DNS Lookup with host

Today while doing some sleuthing, I learned about the host command. host “is a simple utility for performing DNS lookups.” It helped me connect a series of domains to their respective AWS EC2 servers, without a visit to the domain registrar.

Example:

$ host jakeworth.com
jakeworth.com has address 184.168.131.241
jakeworth.com mail is handled by 10 mailstore1.secureserver.net.
jakeworth.com mail is handled by 0 smtp.secureserver.net.

More info: man host

Parcel hot module reloading over ssh

A really great feature of parcel is hot module reloading: when you make a change to a file, that change shows up in the browser where you have your app open.

My parcel project was set up on a remote server that I had ssh’d into, forwarding the default parcel port, 1234, but in the Firefox console I got this error:

Firefox can’t establish a connection to the server at ws://localhost:41393/. hmr-runtime.js:29:11

Turns out I needed to lock my hmr port to a fixed port and then port forward that hmr port over ssh.

Lock the hmr port when you start the parcel server like this:

parcel index.html --hmr-port=55555

And make sure you’re port forwarding that port over ssh:

ssh chris@myserver -L 1234:localhost:1234 -L 55555:localhost:55555

Now your app will establish a web socket connection to the parcel server!