React + SSR + NoScript + CSS in JS Fallbacks

Custom CSS for users with JS disabled.

What does that title even mean?

Assuming you have a site:

  • Using CSS in JS (we’re specifically using emotion, which is very similar to styled-components).
  • Using SSR.
  • That you’d like to work for users with JS disabled.

Then it means this is the article you need to read.

The specific site I was solving this issue for was created with Gatsby, which has all of the above set up by default.

The Scenario

You have some images which you would like to initially be hidden. They will then fade in when the user scrolls to them.

This can be implemented like this:

import React from 'react'
import VisibilitySensor from 'react-visibility-sensor'
import styled from 'react-emotion'
import { css } from 'emotion'

const hiddenStyles = css`
  opacity: 0;
  transform: translate(0px, 60px) scale(1.05, 1.05);
`

const visibleStyles = css`
  opacity: 1;
  transform: translateX(0px) translateY(0px) translateZ(0px) scaleX(1) scaleY(1)
    scaleZ(1);
`

const Container = styled('div')`
  transition: width 0.7s ease 0s, opacity 1200ms, transform 1800ms;
  ${hiddenStyles};
  ${({ visible }) => visible && visibleStyles};
`

class EnterAnimation extends React.Component {
  state = {
    visible: false,
  }

  onChange = visible => {
    if (visible && !this.state.visible) {
      this.setState({ visible: true })
    }
  }

  render() {
    return (
      <VisibilitySensor partialVisibility onChange={this.onChange}>
        <Container visible={this.state.visible}>
          {this.props.children}
        </Container>
      </VisibilitySensor>
    )
  }
}

export default EnterAnimation

The EnterAnimation class wraps its children and reveals them with a CSS transition when they are scrolled into view (using react-visibility-sensor to detect this).
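As a quick usage sketch (the file path, image source, and alt text here are placeholders, not from the real site), wrapping an image looks like this:

import React from 'react'
import EnterAnimation from './EnterAnimation'

// Hypothetical usage: the image is hidden until it scrolls into view, then fades in.
const FadeInImage = () => (
  <EnterAnimation>
    <img src="/images/example.jpg" alt="A photo that fades in on scroll" />
  </EnterAnimation>
)

export default FadeInImage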

The problem with this code is that when Gatsby extracts our CSS it will extract our hidden styles as the default. This means users with JS disabled will not be able to see any elements wrapped in this component.

To get around this problem we can create specific styles for users with JS disabled.

1. Add a no-js class on <html>.

We’re using react-helmet to add attributes to our <html> element. We can use the Helmet component to add a default no-js class to <html>. It’s important that this is only added during SSR, or else it will break styles for users with JS enabled. Checking typeof window lets us determine whether or not we’re doing SSR.

const IS_SSR = typeof window === 'undefined'

<Helmet>
  <html lang="en" className={IS_SSR ? 'no-js' : 'js'} />
</Helmet>

2. Remove no-js with a <script> in <head>.

We add a script in <head> to remove the no-js class (thanks to Paul Irish for this one-liner). Since it’s a script, the class will only be removed for users who have JS enabled, and nothing will run during SSR.

<Helmet
  title={`${title} | ${siteTitle}`}
  meta={[{ name: 'description', content: siteDescription }]}
  script={[
    {
      type: 'text/javascript',
      innerHTML:
        "document.documentElement.className = document.documentElement.className.replace(/\\bno-js\\b/,'js');",
    },
  ]}
>
  <html className="no-js" />
</Helmet>

3. Add a no-js specific style.

To add a no-js specific style we’ll use emotion.

const Container = styled('div')`
  transition: width 0.7s ease 0s, opacity 1200ms, transform 1800ms;
  ${hiddenStyles};
  html.no-js & {
    ${visibleStyles};
  }
  ${({ visible }) => visible && visibleStyles};
`

We target the <html> element when it has the no-js class, then use & to target our specific component. This style will be included in the SSR output, but will only apply while the no-js class is present. For our JS users the class will be removed before the first render.
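The html.no-js & selector isn’t limited to this one component. As a sketch of the same idea (the component and copy below are made up, not from the original site), you could also use it to show content only to no-js users, like a notice explaining that animations are disabled:

import styled from 'react-emotion'

// Hypothetical example: hidden by default, shown only while <html> still has the no-js class.
const NoJsNotice = styled('p')`
  display: none;
  html.no-js & {
    display: block;
  }
`

// Usage: <NoJsNotice>This site works without JavaScript, but animations are disabled.</NoJsNotice>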

That’s about all there is to it. Users without JS don’t make up a large part of most audiences, but when the tooling makes it easy to support them, why not make an attempt?

🚩 FastImage

Performant React Native image component.

Download on GitHub

Law of the Instrument

Why you'll never use this in real life.

“I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.” (Abraham Maslow)

This is referred to as The Law of the Instrument. It follows that if you come across a screw you’d better have a screwdriver, or at least be aware that you should use one.

“I'm never going to use this in real life.” (People)

  • You’re probably right.
  • You’re probably going to do a half-assed job of learning it.
  • When the day comes where you could solve a problem using this:

    • You might not remember this.
    • You might not realize you should use this.
    • You might not be skilled enough to use this.
    • You might take a more basic approach than using this, resulting in an inferior solution.
    • You might just avoid solving the problem altogether.
    • You’ll probably give up and do something more in line with your knowledge and skill level.

What I’m saying is: Maybe you should have paid more attention during linear algebra class.

Good news, it’s not too late: MIT OCW Linear Algebra.


This is directed at myself as much as it is anyone else.

To reach our potential we have to push ourselves to study more advanced methods, and we have to try to apply them to solving problems. This could mean studying formal CS, calculus, linear algebra, compiler design, low-level languages, etc.

The Railer

UOIT Capstone Project

Detecting a stud and nailing the plywood sheet down.

The goal of the project was to create a robot that could nail down plywood sheeting on roofs.

The final product wasn’t very polished. It was easy to think of how everything should work, it was much more difficult to actually build it and work out all the real world details.

Components

  • Raspberry Pi to control all the other components.
  • Motors with encoders to perform turns and go straight using PID control.
  • Ultrasonic sensors to sense the roof edge and align with it.
  • Stud finder to sense the joists.
  • Linear actuator to press down the nail gun.
  • Solenoid to trigger the nail gun.
  • Coded mostly in Python, using pigpio for things needing better performance, like the encoders.

Aligning with the Edge of the Roof

Preparing to put in the next row of nails. Turn, forward, turn, align, forward.

Using only the encoders to get ready for the next row would have led to an accumulation of error in the robot’s angle. A routine had to be created to align the robot with the edge.

Aligning routine states. The black dots are ultrasonic sensors. The line is the edge of the roof.

We used some ultrasonic sensors and a simple state-based program to align it with the edge of the roof. This solution was cheaper and simpler than our other options: computer vision or a local positioning system.
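The robot itself was coded in Python, but the core of the alignment step can be sketched in a few lines. This is a rough illustration with made-up names and thresholds, not the project’s actual code: the two sensor readings are compared, and the robot rotates until they agree.

// Hypothetical sketch of the alignment idea (the real code ran in Python on the Raspberry Pi).
// Two ultrasonic sensors read the distance to the roof edge; when the readings match,
// the robot is square with the edge.
const ALIGNED_TOLERANCE_CM = 0.5 // assumed tolerance

function alignmentStep(frontDistanceCm, rearDistanceCm, drive) {
  const difference = frontDistanceCm - rearDistanceCm
  if (Math.abs(difference) <= ALIGNED_TOLERANCE_CM) {
    drive.stop()
    return 'aligned'
  }
  // Rotate toward whichever end is further from the edge.
  if (difference > 0) {
    drive.rotateClockwise()
  } else {
    drive.rotateCounterClockwise()
  }
  return 'aligning'
}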
