Files 1, 2, & 3 for rmitchell:



The bottom layer of the material, described April 3 at the Materials
Research Society spring meeting, features carbon nanotube pores embedded within
a flexible synthetic polymer film. These pores are just a few nanometers
across -- too small for bacteria or viruses to squeeze through, but
wide enough for sweat to escape.

The top layer offers further protection. It is made of another,
spongy polymer that normally allows water and other molecules to pass
through. But when the polymer comes into contact with G-series nerve
agents -- the family of toxic chemicals that includes sarin gas --
it flattens into a dense sheet that seals over the carbon nanopores
underneath. The polymer can be restored to its original state by soaking it
in a high-pH chemical broth.

Both layers together are still less than half the thickness of a sheet of
paper, and could be laid over fabrics without putting the wearer at risk
of overheating. That's an improvement over typical protective gear,
which is permanently sealed against contaminants, said study coauthor
Francesco , a chemical engineer.


When you've got leverage, don't be afraid to use it. That's been
Google's modus operandi in the news and publishing world over the last
year or so: it has pushed its AMP platform, funded various news-related
ventures that may put it ahead, and nourished its personalized Chrome tabs
on mobile. The latter, as Nieman Lab notes, grew 2,100 percent in 2017.

You may have noticed, since Chrome is a popular mobile browser and
this setting is on by default, that "Articles for You" appears
automatically in every new tab, showing you a bunch of articles the company
thinks you'd like. And it's gone from driving 15 million article views to
a staggering 341 million over the last year.

In late 2016, when Google announced the product, I described it as
"polluting" the otherwise useful tab page. I also don't like the idea of
being served news when I'm not actively looking for it -- I understand
that when I visit Google News (and I do) that my browser history
(among other things) is being scoured to determine which categories and
stories I'll see. I also understand that everything I do on the site, as
on every Google site, is being entered into its great data engine in order
to improve its profile of me.


Red flags and "disputed" tags just entrenched people's views about
suspicious news articles, so Facebook is hoping to give readers a wide
array of info so they can make their own decisions about what's
misinformation. Facebook will try showing links to a journalist's
Wikipedia entry, other articles, and a follow button to help users make up
their mind about whether they're a legitimate source of news. The test
will be shown to a subset of users in the U.S. when they click on an
author's name within an Instant Article, provided the author's publisher
has implemented Facebook's author tags.

Meanwhile, Facebook is rolling out to everyone in the U.S. its
test from October that gives readers more context about publications by
showing links to their Wikipedia pages, related articles about the same
topic, how many times the article has been shared and where, and a button
for following the publisher within an "About This Article" button.
Facebook will also start to show whether friends have shared the article,
along with a snapshot of the publisher's other recent articles.

These moves are designed to feel politically neutral to prevent
Facebook from being accused of bias. After former contractors reported
that they suppressed conservative Trending topics on Facebook in 2016,
Facebook took a lot of heat for supposed liberal bias. That caused it to
hesitate when fighting fake news before the 2016 presidential election...
and then spend the next two years dealing with the backlash for allowing
misinformation to run rampant.
