This may look like a photograph, but the highly realistic face staring back at you belongs to a man who died over 700 years ago. The researchers who performed this unbelievable facial reconstruction say their work is providing new details about the way ordinary people lived in medieval England.
Fictiv is a rapid prototyping company that can take concepts or finished designs and farm them out to a network of CNC and 3D printing companies to have your design fabricated, finished and delivered within 24 hours. To demonstrate their new open IoT platform, they've announced an open-source hardware IoT motorcycle kit that you're meant to be able to assemble in your garage in a weekend and ride off on by Monday. (more…)
Netflix and Marvel’s Iron Fist is not good TV. It is bad and boring TV with terrible fight scenes and a lead actor who comes from the “petulant grimace” school of acting. The only reason to watch Iron Fist is so you can be prepared for The Defenders, the epic series that will cross Iron Fist over with the casts of …
An anonymous reader quotes a report from Ars Technica: Some Kansas City residents who have been waiting years for Google Fiber to install service at their homes recently received e-mails canceling their installations, with no word on whether they'll ever get Internet service from the company. KSHB 41 Action News in Kansas City, Missouri, "spoke to several people, living in different parts of the metro, all who have recently received cancellation e-mails," the station reported last week. "The e-mails do not provide a specific reason for the cancellations. Instead they say the company was 'unable to build our network to connect your home or business at this time.'" While Google Fiber refuses to say how many installations have been canceled, KSHB said, "there is speculation the number of cancellations in the metro is as high as 2,700." "The company says it has slowed down in some areas to experiment with new techniques," such as wireless technology, the report also said. Google Fiber is still hooking up fiber for some new customers in parts of the Kansas City area. One resident who had his installation canceled is Larry Meurer, who was seeing multiple Google Fiber trucks in his neighborhood nearly two years ago, in the spring of 2015. "I'm left wondering what's going on," he told KSHB after getting the cancellation e-mail. Meurer lives in Olathe, Kansas, one of the largest cities in the Kansas City metro area. Residents only five houses away and around the corner have Google Fiber service, the report said. But Meurer said he and several neighbors who never got service were "terminated."
Read more of this story at Slashdot.
Near the end of Muppet Guys Talking, puppeteer Dave Goelz is asked a question about where he’s found the nobility in his life’s work. “[In] folly,” he answers. “Human folly. Celebrating the degree to which we’re all lost.” That’s just one instance of how this documentary about the world’s favorite puppets gets…
97 Things Every Programmer Should Know was published seven years ago by O'Reilly Media, and was described as "pearls of wisdom for programmers collected from leading practitioners." Today an anonymous reader writes: All 97 are available online for free (and licensed under a Creative Commons Attribution 3.0 license), including an essay by "Uncle Bob" on taking personal responsibility and "Unix Tools Are Your Friend" by Athens-based professor Diomidis Spinellis, who writes that the Unix tool chest can be more useful than an IDE. But the book's official site is also still accepting new submissions, and now points to 68 additional "edited contributions" (plus another seven "contributions in progress"), including "Be Stupid and Lazy" by Swiss-based Java programmer Mario Fusco, and "Decouple That UI" by tech trainer George Brooke. "There is no overarching narrative," writes the site's editor Kevlin Henney (who also edited the original book). "The collection is intended simply to contain multiple and varied perspectives on what it is that contributors to the project feel programmers should know...anything from code-focused advice to culture, from algorithm usage to agile thinking, from implementation know-how to professionalism, from style to substance..."
100,000 people have already downloaded an app that helps fight human trafficking. dryriver summarizes a report from CNN: Police find an ad for paid sex online. It's an illegally trafficked underage girl posing provocatively in a hotel room. But police don't know where this hotel room is -- what city, what neighborhood, what hotel or hotel room. This is where the TraffickCam phone app comes in. When you're staying at a hotel, you take pictures of your room... The app logs the GPS data (location of the hotel) and also analyzes what's in the picture -- the furniture, bed sheets, carpet and other visual features. This makes the hotel room identifiable. Now when police come across a sex trafficking picture online, there is a database of images that may reveal which hotel room the picture was taken in. "Technology drives everything we do nowadays, and this is just one more tool that law enforcement can use to make our job a little safer and a little bit easier," says Sergeant Adam Kavanaugh, supervisor of the St. Louis County Multi-Jurisdictional Human Trafficking Task Force. "Right now we're just beta testing the St. Louis area, and we're getting positive hits," he says (meaning ads that match hotel-room photos in the database). But the app's creators hope to make it available to all U.S. law enforcement within the next few months, and eventually globally, so their app is already collecting photographs from hotel rooms around the world to be stored for future use.
Here’s your cool sports hardware demo of the week: ShotTracker, the Kansas City-based startup whose team-focused offering can track and collect analytics from an entire basketball game in real-time, is demoing for all 31 games of the NAIA D1 Men’s National Championship tournament this week in Kansas City. It’s the first time automated, real-time stats have been available… Read More
We spend a lot of time and words on what autonomous cars can do, but sometimes it’s a more interesting question to ask what they can’t do. The limitations of a technology are at least as important as its capabilities. That’s what this little bit of performance art tells me, anyway. Read More
So I got some Wendy's today, and they have
little pop-up adventure sets as the toy. My daughter is 5yo, and she's not quite ready for full-blown RPGs just yet, but these are great little sets to do a little story and maybe roll some dice.
Pretty neat idea, and all you have to do is suffer through Wendy's food to get it!
hey! writes: The U.S. Office of Management and Budget has released a budget "blueprint" which outlines substantial cuts in both basic research and applied technology funding. The proposal includes a whopping 18% reduction in National Institutes of Health medical research. NIH does get a new $500 million fund to track emerging infectious agents like Zika in the U.S., but loses its funding to monitor those agents overseas. The Department of Energy's research programs also get an 18% cut, potentially affecting basic physics research, high energy physics, fusion research, and supercomputing. The Advanced Research Projects Agency-Energy (ARPA-E) gets the ax, as does the Advanced Technology Vehicle Manufacturing Program, which enabled Tesla to manufacture its Model S sedan. EPA loses all climate research funding, and about half the research funding targeted at human health impacts of pollution. The Energy Star program is eliminated; Superfund funding is drastically reduced. The Chesapeake Bay and Great Lakes cleanup programs are also eliminated, as is all screening of pesticides for endocrine disruption. In the Department of Commerce, Sea Grant is eliminated, along with all coastal zone research funding. The existing weather satellites GOES and JPSS continue to receive funding, but JPSS-3 and -4 appear to be getting the ax. Support for transfer of federally funded research and technology to small and mid-sized manufacturers is eliminated. NASA gets a slight trim, and a new focus on deep space exploration paid for by an elimination of Earth Science programs. You can read more about this "blueprint" in Nature, Science, and the Washington Post, which broke the story. The Environmental Protection Agency, the State Department and Agriculture Department took the hardest hits, while the Defense Department, Department of Homeland Security, and Department of Veterans Affairs have seen their budgets grow.
Last Wednesday, for no apparent reason, the TeamViewer remote desktop application stopped working on the network of one of the UK's largest ISPs, TalkTalk. The apparent reason, as the investigation found, is that scammers in India have been abusing the application to make money. An anonymous reader shares a report: It's a popular application with remote support professionals and power users alike, and so support forums soon filled with complaints from perplexed users who noticed that access was possible with 4G and some TalkTalk business connections but not home broadband. By Thursday, journalists dragged the truth out of the company that it had "blocked a number of applications including TeamViewer," which led to a joint statement confirming this on TeamViewer's website: TeamViewer and TalkTalk are in extensive talks to find a comprehensive joint solution to better address this scamming issue. We now know (as some suspected at the time) that the block was connected to abuse of TeamViewer by criminals based in India who had been using it as part of a tech support scam targeting TalkTalk customers. The BBC reported on this two days before the block, including the disturbing claim that the criminals had been able to quote stolen customer account data to make scam calls sound more convincing.
These days, most large FLOSS communities have a "Code of Conduct": a document that outlines the behaviour that contributors to the community should (and should not) exhibit. By writing such a document, a community can arm itself more strongly in the fight against trolls, harassment, and other forms of antisocial behaviour that are rampant on the anonymous medium that the Internet still is.
Writing a good code of conduct is no easy matter, however. I should know -- I've been involved in such a process twice; once for Debian, and once for FOSDEM. While I was the primary author for the Debian code of conduct, the same is not true for the FOSDEM one; I was involved, and I did comment on a few early drafts, but the core of FOSDEM's current code was written by another author. I had wanted to write a draft myself, but then this one arrived and I didn't feel like I could improve it, so it remained.
While it's not easy to come up with a Code of Conduct, there (luckily) are others who have walked this path before you. On the "geek feminism" wiki, there is an interesting overview of existing Open Source community and conference codes of conduct, and reading one or more of them can provide some inspiration for things to put in your own code of conduct. That wiki page also contains a paragraph "Effective codes of conduct", which says (amongst other things) that a good code of conduct should include:
Specific descriptions of common but unacceptable behaviour (sexist jokes, etc.)
The attentive reader will notice that such specific descriptions are noticeably absent from both the Debian and the FOSDEM codes of conduct. This is not because I hadn't seen the above recommendation (I had); it is because I disagree with it. I do not believe that adding a list of "don't"s to a code of conduct is a net positive to it.
Why, I hear you ask? Surely having a list of things that are not welcome behaviour is a good thing, which should be encouraged? Surely such a list clarifies the kind of things your community does not want to see? Having such a list will discourage that bad behaviour, right?
Well, no, I don't think so. And here's why.
A list of things not to do is like a virus scanner. For those not familiar with these: on some operating systems, there is a specific piece of software that everyone recommends you run, which checks if particular blobs of data appear in files on the disk. If they do, then these files are assumed to be bad, and are kicked out. If they do not, then these files are assumed to be not bad, and are left alone (for the most part).
This works if we know all the possible types of badness; but as soon as someone invents a new form of badness, suddenly your virus scanner is ineffective. Additionally, it means you're bound to continually update your virus scanner (or, as the case may be, your code of conduct) to keep up with a continually changing hostile world. For these (and other) reasons, enumerating badness is listed as number 2 in security expert Marcus Ranum's "six dumbest ideas in computer security," which was written in 2005.
In short, a list of "things not to do" is bound to be incomplete; if the goal is to clarify the kind of behaviour that is not welcome in your community, it is usually much better to explain the behaviour that is wanted, so that people can infer (by their absence) the kind of behaviour that isn't welcome.
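The contrast between enumerating badness and enumerating goodness can be sketched in a few lines of Python. This is just an illustration; the set names and file names are made up:

```python
KNOWN_BAD = {"virus-a", "virus-b"}        # the blacklist: enumerated badness
KNOWN_GOOD = {"report.txt", "notes.md"}   # the whitelist: enumerated goodness

def blacklist_allows(name):
    # Anything not explicitly listed is assumed fine,
    # so brand-new badness slips through by default.
    return name not in KNOWN_BAD

def whitelist_allows(name):
    # Anything not explicitly listed is rejected,
    # so brand-new badness fails by default.
    return name in KNOWN_GOOD
```

A never-before-seen "virus-c" passes `blacklist_allows` but not `whitelist_allows`: the blacklist has to be updated for every new form of badness, while the whitelist rejects it without any update at all.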
This neatly brings me to my next point...
The world isn't black-and-white. We could define a list of welcome behaviour -- let's call that the whitelist -- or a list of unwelcome behaviour -- the blacklist -- and assume that the work is done after doing so. However, that wouldn't be true. For every item on either the white or black list, there's going to be a number of things that fall somewhere in between. Let's say those things are on the "gray" list. They're not the kind of outstanding behaviour that we would like to see -- they'd be on the white list if they were -- but they're not really obvious CoC violations, either. You'd prefer it if people didn't do those things, but it'd be a stretch to say they're jerks if they do.
Let's clarify that with an example:
Is it a code of conduct violation if you post links to pornography websites on your community's main development mailinglist? What about jokes involving porn stars? Or jokes that denigrate women, or that explicitly involve some gender-specific part of the body? What about an earring joke? Or a remark about a user interacting with your software, where the women are depicted as not understanding things as well as men? Or a remark about users in general, that isn't written in a gender-neutral manner? What about a piece of self-deprecating humor? What about praising someone else for doing something outstanding?
I'm sure most people would agree that the first case in the above paragraph should be a code of conduct violation, whereas the last case should not be. Some of the items in between are clearly on one or the other side of the argument, but for others the jury is out. Let's say those are in the gray zone. (Note: no, I did not mean to imply that the list is ordered in any way.)
If you write a list of things not to do, then by implication (because you didn't mention them), the things in the gray area are okay. This is especially problematic when it comes to things that are borderline blacklisted behaviour (or that should be blacklisted but aren't, because your list is incomplete -- see above). In such a situation, you're dealing with people who are jerks but can argue about it because your definition of jerk didn't cover their behaviour. Because they're jerks, you can be sure they'll do everything in their power to waste your time about it, rather than improving their behaviour.
In contrast, if you write a list of things that you want people to do, then by implication (because you didn't mention it), the things in the gray area are not okay. If someone slips and does something in that gray area anyway, then that probably means they're doing something borderline not-whitelisted, which would be mildly annoying but doesn't make them jerks. If you point that out to them, they might go "oh, right, didn't think of it that way, sorry, will aspire to be better next time". Additionally, the actual jerks and trolls will have been given fewer tools to argue about borderline violations (because the border of your code of conduct is far, far away from jerky behaviour), so less time is wasted for those in your community who have to police it (yay!).
In theory, the result of a whitelist is a community of people who aspire to be nice people, rather than a community of people who simply aspire to be "not jerks". I know which kind of community I prefer.
During one of the BOFs that were held while I was drafting the Debian code of conduct, it was pointed out to me that a list of things not to do may give people the impression that everything on the list actually happens in that community. If that is true, then a very long list may produce the impression that the given community is a community with a lot of problems.
Instead, a whitelist-based code of conduct will provide the impression that you're dealing with a healthy community. Whether that is the case obviously depends on more factors than just the code of conduct itself, but it will put people in the right mindset for this to become something of a self-fulfilling prophecy.
Given all of the above, I think a whitelist-based code of conduct is a better idea than a blacklist-based one. Additionally, in the few years since the Debian code of conduct was accepted, it is my impression that the general atmosphere in the Debian project has improved, which would seem to confirm that the method works (but YMMV, of course).
At any rate, I'm not saying that blacklist-based codes of conduct are useless. However, I do think that whitelist-based ones are better; and hopefully, you now agree, too.
My latest Publishers Weekly column announces the launch-date for my long-planned "Shut Up and Take My Money" ebook platform, which allows traditionally published authors to serve as retailers for their publishers, selling their ebooks direct to their fans and pocketing the 30% that Amazon would usually take, as well as the 25% the publisher gives back to them later in royalties. (more…)
Drawing on research in economics, psychology and sociology, the study shows how people select their own reality by deliberately avoiding information that threatens their happiness and wellbeing and selectively directing attention to information that affirms what they believe or reflects favorably upon them.
From towing kite propulsion to sails fitted with solar panels, modern engineers have been working hard to find ways to make our increasing reliance on big cargo shipping more energy efficient and environmentally friendly. Finnish company Norsepower has looked to the past for inspiration, finding a solution in a nearly century-old engineering innovation relegated to the annals of quirky mechanical history...
Sandman artist Dave McKean has once again lent his talents to Neil Gaiman’s iconic works, illustrating a new edition of Gaiman’s supernatural novel American Gods, and it simply has to be seen to be believed.
Hugh writes, "These amazing animated shorts on physics feature an adorable, 1930s-style version of Maxwell's Demon. There are 3 so far -- can't wait to see more!"
Congress has finally passed a bill authorizing NASA's new budget that gives the agency annual funding of $19.5 billion. The paperwork remains mostly unchanged from when it was passed in the Senate last December, with only minor alterations being made....
Here’s something you never want to see:
ZFS has detected a checksum error:

  eid: 138
  class: checksum
  host: alexandria
  time: 2017-01-29 18:08:10-0600
  vtype: disk
This means there was a data error on the drive. But it's worse than a typical data error -- this is an error that was not detected by the hardware. ZFS and btrfs write a checksum with every block of data (both data and metadata) written to the drive, and the checksum is verified at read time. Most filesystems don't do this, because theoretically the hardware should detect all errors. But in practice, it doesn't always, which can lead to silent data corruption. That's why I use ZFS wherever I possibly can.
As I looked into this issue, I saw that ZFS repaired about 400KB of data. I thought, “well, that was unlucky” and just ignored it.
Then a week later, it happened again. Pretty soon, I noticed it happened every Sunday, and always to the same drive in my pool. It so happens that the highest I/O load on the machine happens on Sundays, because I have a cron job that runs zpool scrub on Sundays. This operation forces ZFS to read and verify the checksums on every block of data on the drive, and is a nice way to guard against unreadable sectors in rarely-used data.
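The write-time-checksum, verify-at-read-and-scrub idea can be seen in miniature in a few lines of Python. This is a toy sketch, not ZFS's actual on-disk format (ZFS defaults to fletcher4 checksums over variable-sized blocks; fixed 4 KB blocks and SHA-256 here are just for illustration):

```python
import hashlib

BLOCK = 4096  # fixed block size for the sketch; real ZFS blocks vary

def write_blocks(path, data):
    """Write data to disk, recording a checksum for every block as it goes."""
    sums = []
    with open(path, "wb") as f:
        for i in range(0, len(data), BLOCK):
            chunk = data[i:i + BLOCK]
            f.write(chunk)
            sums.append(hashlib.sha256(chunk).hexdigest())
    return sums

def scrub(path, sums):
    """Re-read every block and verify its checksum, like `zpool scrub`.

    Returns the indices of blocks whose contents no longer match.
    """
    bad = []
    with open(path, "rb") as f:
        for i, expected in enumerate(sums):
            if hashlib.sha256(f.read(BLOCK)).hexdigest() != expected:
                bad.append(i)
    return bad
```

If the drive silently flips a bit in one block, the hardware still reports a successful read, but `scrub` flags that block's index -- which is exactly the class of error the checksum event at the top of this story reports.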
I finally swapped out the drive, but to my frustration, the new drive exhibited the same issue. The SATA protocol does include a CRC32 checksum, so it seemed (to me, at least) that the problem was unlikely to be a cable or chassis issue. I suspected the motherboard.
It so happened I had a 9211-8i SAS card. I had purchased it off eBay awhile back when I built the server, but could never get it to see the drives. I wound up not filling it up with as many drives as planned, so the on-board SATA did the trick. Until now.
As I poked at the 9211-8i, noticing that even its configuration utility didn’t see any devices, I finally started wondering if the SAS/SATA breakout cables were a problem. And sure enough – I realized I had a “reverse” cable and needed a “forward” one. $14 later, I had the correct cable and things are working properly now.
One other note: RAM errors can sometimes cause issues like this, but this system uses ECC DRAM and the errors would be unlikely to always manifest themselves on a particular drive.
So over the course of this, had I not been using ZFS, I would have had several megabytes of reads with undetected errors. Thanks to using ZFS, I know my data integrity is still good.
The upcoming Quake Champions will be free-to-play. Or not. It depends on how you want to approach it. "At its core, it's a free-to-play game with the option to buy the Champion Pack and just get in and play with all the Champions," developer Bethesda...
Sometimes, the world can feel like a scary and confusing place. There are forces beyond our control. We feel helpless and small. Then, a Tyrannosaurus Rex plays Dungeons & Dragons, and suddenly, the world makes sense again.
Two unmanned probes lost in space have been located by NASA's Jet Propulsion Laboratory in Pasadena, California. Too small to be seen with optical telescopes, NASA's Lunar Reconnaissance Orbiter (LRO) and the Indian Space Research Organization's Chandrayaan-1 spacecraft were found by ground-based radar stations using a pioneering radar technique that could help in planning future missions to the Moon...
Michael Scott, from Birmingham, went to photograph the caves after seeing a video of them online. He said: "I traipsed over a field to find it, but if you didn't know it was there you would just walk right past it. Considering how long it's been there it's in amazing condition, it's like an underground temple." The tunnel leads to a network of walkways and arches carved out of sandstone, as well as a font.
The cave is evidently a hot place to hang out if you're a witch. Be sure to ask the property owners nicely and clean up after the ritual is complete.
One year after Christmas, the labyrinth of intricately carved chambers was found to be filled with candles, sinister symbols scrawled on the walls and more besides.
The owners of the site, hidden in dense woodland ten miles from Wolverhampton, decided enough was enough when two warlocks knocked on the door – and asked for their robes back.
The red-faced pair had left the garments behind after a ritual.
Celebrate International Women's Day in stfnal style with Nevertheless She Persisted, a free anthology of original flash fiction by some of science fiction's leading women voices, from Catherynne M. Valente to Amal El-Mohtar to Jo Walton to Nisi Shawl to Charlie Jane Anders to Seanan McGuire to Alyssa Wong to Kameron Hurley -- and more! (more…)
The phone in your pocket can hold millions of times more information than a device the size of a fridge could decades ago, and for that we can thank continuous improvements to data storage density. Now, having created the world's smallest magnet, IBM has managed to store one bit of data in a single atom, in a breakthrough that could lead to storage devices that can hold 1,000 times more data in the same physical space as current HDDs...
Star Trek: Discovery's cast has slowly been coming together, but there's been one glaring omission: who's helming the show's namesake ship? At last, we know: say hello to Jason Isaacs, who will play the USS Discovery's Captain Lorca in the internet-f...
Back in 2014 over 3 million Internet users told the U.S. government loudly and clearly: we value our online security, we value our online privacy, and we value net neutrality. Our voices helped convince the FCC to enact smart net neutrality regulations—including long-needed privacy rules.
But it appears some members of Congress didn’t get the message, because they’re trying to roll back the FCC’s privacy rules right now without having anything concrete ready to replace them. We’re talking here about basic requirements, like getting your explicit consent before using your private information to do anything other than provide you with Internet access (such as targeted advertising). Given how much private information your ISP has about you, strict limits on what they do with it are essential.
Luckily, we can stop this train wreck before it happens. But we need your help: please call your senators and your representative right now and tell them to oppose any use of the Congressional Review Act (“the CRA”—they’ll know what it is) to roll back the FCC’s new rules about ISP privacy practices.
If you want more ammo for your conversation with congressional staff, read on. But if you’re already fired up, please click here to take action right now.
Together, we can stop Congress from undermining crucial privacy protections.
Late last year, the FCC passed rules that would require ISPs to protect your private information. It covered the things you would usually associate with having an account with a major company (your name and address, financial information, etc.) but also things like any records they keep on your browsing history, geolocation information (think cell phones), and the content of your communications. Overall, the rules were pretty darn good.
But now, Senator Flake (R-AZ) and Representative Blackburn (R-TN) want to use a tool known as a Congressional Review Act resolution to totally repeal those protections. The CRA allows Congress to veto any regulation written by a federal agency (like the FCC). Worse yet, it forbids the agency from passing any “substantially similar” regulations in the future, so the FCC would be forbidden from ever trying to regulate ISP privacy practices. At the same time, some courts have limited the Federal Trade Commission’s ability to protect your privacy, too.
With the hands of two federal agencies tied, ISPs themselves would be largely in charge of protecting their customers' privacy. In other words, the fox will be guarding the henhouse.
If we seem a little insistent that you take action to stop this, that’s because we sincerely believe that together, we can stop this disaster before it comes to pass. Every time someone calls their representative or senators, an angel gets its wings... er, we’re one step closer to protecting the privacy of all U.S. Internet users. If we raise our voices the same way we did when it came to passing net neutrality, Congress won’t be able to ignore us.
So please, take action and call your senator and representative today, and tell them not to use the CRA to repeal the FCC’s privacy rules.
When you read a review for a product, you're usually looking for tangible qualities like battery life and performance. As we've seen lately, though, the company's respect for your data matters -- a seemingly perfect gift may turn out to be a privacy...
In response to a U.S. Justice Department order that requires colleges and universities make website content accessible for citizens with disabilities and impairments, the University of California, Berkeley, will cut off public access to tens of thousands of video lectures and podcasts. Officials said making the videos and audio more accessible would have proven too costly in comparison to removing them. Inside Higher Ed reports: Today, the content is available to the public on YouTube, iTunes U and the university's webcast.berkeley site. On March 15, the university will begin removing the more than 20,000 audio and video files from those platforms -- a process that will take three to five months -- and require users sign in with University of California credentials to view or listen to them. The university will continue to offer massive open online courses on edX and said it plans to create new public content that is accessible to listeners or viewers with disabilities. The Justice Department, following an investigation in August, determined that the university was violating the Americans With Disabilities Act of 1990. The department reached that conclusion after receiving complaints from two employees of Gallaudet University, saying Berkeley's free online educational content was inaccessible to blind and deaf people because of a lack of captions, screen reader compatibility and other issues. Cathy Koshland, vice chancellor for undergraduate education, made the announcement in a March 1 statement: "This move will also partially address recent findings by the Department of Justice, which suggests that the YouTube and iTunes U content meet higher accessibility standards as a condition of remaining publicly available. Finally, moving our content behind authentication allows us to better protect instructor intellectual property from 'pirates' who have reused content for personal profit without consent."
Pippin Barr's Snakisms is a version of the classic game Snake, but with a selection of philosophical viewpoints to choose from at the outset.
SNAKISMS was begun on the strength of the idea of "Ascetic Snake", a game of Snake in which the snake isn't meant to eat the apple (or whatever that thing is in Snake). That basic reversal of the standard form of the game struck me as funny because those sorts of things always strike me as funny, but on turning to actually make the game it seemed pretty clear it was too much of a throw-away idea all on its own.
And so it came to pass that I decided I needed to make a whole set of Snake games based (loosely) on different philosophies, eventually settling on the idea of "isms" because SNAKISMS is really a pretty great title for a game, I think you'll agree. The design process took a surprisingly long time in terms of coming up with a set of "reasonable" interpretations of philosophies/isms that could be translated in some way to the mechanics of the original Snake game.
The creator's Comp Sci PhD thesis concerns the moral dimensions of gameplay.
"A new technology is enabling everyday objects, such as posters and clothing, to be transformed into FM radio stations," reports The Stack, citing research from the University of Washington. An anonymous reader quotes their report. The team has introduced a technique called "backscattering" which uses ambient low-power radio signals to broadcast messages from random objects to smartphones in the local vicinity. The researchers hope that the development could help support various smart city applications, and picture a future where anything from a poster at a bus stop to a road sign can transmit audio updates and information to passers-by. During testing, the researchers were able to use the backscattering technique to create a "singing poster" which could send out the music of an advertised band to smartphone users at a distance of up to 4 meters and to cars in an 18-meter [59-foot] radius. "What we want to do is enable smart cities and fabrics where everyday objects in outdoor environments -- whether it's posters or street signs or even the shirt you're wearing -- can 'talk' to you by sending information to your phone or car," explained lead faculty and UW assistant professor of computer science and engineering Shyam Gollakota.
Read more of this story at Slashdot.
schwit1 quotes a report from ScienceAlert: Researchers have developed a technique that allows them to rapidly thaw cryopreserved human and pig samples without damaging the tissue -- a development that could help get rid of organ transplant waiting lists. Cryopreservation is the ability to preserve tissues at liquid nitrogen temperatures for long periods of time and bring them back without damage, and it's something scientists have been dreaming about achieving with large tissue samples and organs for decades. Instead of using convection, the team used nanoparticles to heat tissues at the same rate all at once, which means ice crystals can't form, so the tissues don't get damaged. To do this, the researchers mixed silica-coated iron oxide nanoparticles into a solution and generated uniform heat by applying an external magnetic field. They then warmed up several human and pig tissue samples ranging between 1 and 50 mL, using either their new nanowarming technique or traditional slow warming over ice. Each time, the tissues warmed with nanoparticles displayed no signs of harm, unlike the control samples. Afterwards, they were able to successfully wash the nanoparticles away from the sample after thawing. The team also tested out the heating in an 80 mL system -- without tissue this time -- and showed that it achieved the same critical warming rates as in the smaller sample sizes, suggesting that the technique is scalable. You can view a video of tissue being thawed out in less than a minute here. The research has been published in Science Translational Medicine.
mspohr writes: Eureka Magazine has a story about the latest NASA 2017-2018 software catalog. From the report: "NASA has released its 2017-2018 software catalogue free of charge to the public, without any royalty or copyright fees. This third edition of the publication has contributions from all the agency's centers on data processing/storage, business systems, operations, propulsion and aeronautics. It includes many of the tools NASA uses to explore space and broaden our understanding of the universe. 'The software catalogue is our way of supporting the innovation economy by granting access to tools used by today's top aerospace professionals to entrepreneurs, small businesses, academia and industry,' said Steve Jurczyk, associate administrator for NASA's Space Technology Mission Directorate (STMD) in Washington. 'Access to these software codes has the potential to generate tangible benefits that create jobs, earn revenue and save lives.'" Amazing amount of quality software... it IS rocket science. Further reading (and digesting): TechCrunch
NASA has just published its 2017-2018 software catalog, which lists the many apps, code libraries and tools that pretty much anyone can download and use. Of course, most of it is pretty closely tied to… you know, launching spacecraft and stuff, which most people don’t do. But here are a few items that might prove useful to tinkerers and curious lay people alike. Read More
As a child, writer Lisa Hix visited Silver Dollar City, a surreal theme park in the Ozark Mountains that I have been fortunate enough to experience myself. Like me, Lisa was enchanted with the nutty dark ride Fire In The Hole and its story of people in creepy devil-horned hoods who torched a town. No, they weren't KKK members but rather the Bald Knobbers, a 19th century vigilante group. Over at Collectors Weekly, Lisa explores the history of the Bald Knobbers:
Though they never lit a town on fire—that part of the ride is completely invented—the real story of their rise is a terrifying parable about what happens when government fails and violence reigns. It’s a lesson that’s perhaps more relevant in the political climate of 2017 than Americans would like it to be.
When I called Dr. Matthew J. Hernando, a professor at Ozark Technical College and author of Faces Like Devils: The Bald Knobber Vigilantes in the Ozarks, he told me that “Fire in the Hole”—which he has ridden many times—“is basically a bunch of nonsense.” For the real story of the Bald Knobbers, Hernando explained, you have to look at southwest Missouri’s peculiar history. In a region where the Civil War had laid waste to the rule of law, ne’er do wells like the notorious James-Younger Gang and vigilante groups like the Bald Knobbers emerged to fill the void of authority. Admirers saw them as righteous folk heroes; adversaries regarded them as murderous thugs.
Please use the correct (perma)link to bookmark this article, not the page listing all wlog entries of the last decade. Thank you.
Some updates inline and at the bottom.
The new Terms of Service of GitHub became effective today, which is quite problematic — there was a review phase, but my reviews pointing out the problems were not answered, and, while the language is somewhat changed from the draft, they became effective immediately.
Now, the new ToS are not so bad that one immediately must stop using their service for disagreement, but it’s important that certain content may no longer legally be pushed to GitHub. I’ll try to explain which is affected, and why.
I’m mostly working my way backwards through section D, as that’s where the problems I identified lie, and because this proceeds from easier to harder.
Note that using a private repository does not help, as the same terms apply.
Section D.7 requires the person uploading content to waive any and all attribution rights. Ostensibly “to allow basic functions like search to work”, which I can even believe, but, for a work the uploader did not create completely by themselves, they can’t grant this licence.
The CC licences are notably bad because they don’t permit sublicencing, but even so, anything requiring attribution can, in almost all cases, not be “written or otherwise, created or uploaded by our Users”. This is fact, and the exceptions are few.
Section D.5 requires the uploader to grant all other GitHub users…
Note that section D.4 is similar, but granting the licence to GitHub (and their successors); while this is worded in a much friendlier way than in the draft, that fact only makes it harder to see whether it affects works in a similar way. But that doesn’t matter, since D.5 is clear enough. (This doesn’t mean it’s not a problem, just that I don’t want to go there and analyse D.4, as D.5 points out the same problems but is easier.)
This means that any and all content under copyleft licences is also no longer welcome on GitHub.
Some licences are famous for requiring people to keep the original intact while permitting patches to be piled on top; this is actually permissible for Open Source, even though annoying, and the most common LaTeX licence is rather close to that. Section D.3 says any (partial) content can be removed — though keeping a PKZIP archive of the original is a likely workaround.
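The PKZIP-archive workaround mentioned above can be sketched in a few lines: keep the pristine original as a ZIP archive inside the repository, so that patches pile up next to it while the original stays byte-for-byte intact. A minimal sketch using Python's standard `zipfile` module; the file names are purely illustrative:

```python
# Sketch of the workaround: archive the unmodified original files in a
# ZIP so they survive intact no matter what is layered on top of them.
import zipfile

def archive_original(zip_path, files):
    """Store the given (name, bytes) pairs in a new ZIP archive."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in files:
            zf.writestr(name, data)

def restore_original(zip_path):
    """Read the archived originals back, byte-for-byte unmodified."""
    with zipfile.ZipFile(zip_path) as zf:
        return {name: zf.read(name) for name in zf.namelist()}
```

Whether such an archive actually satisfies a keep-the-original-intact licence clause is a legal question, not a technical one; this only shows the mechanics.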
Anything copyleft (GPL, AGPL, LGPL, CC-*-SA) or requiring attribution (CC-BY-*, but also 4-clause BSD, Apache 2 with NOTICE text file, …) are affected. BSD-style licences without advertising clause (MIT/Expat, MirOS, etc.) are probably not affected… if GitHub doesn’t go too far and dissociates excerpts from their context and legal info, but then nobody would be able to distribute it, so that’d be useless.
Only “continuing to use GitHub” constitutes accepting the new terms. This means that repositories from people who last used GitHub before March 2017 are excluded.
Even then, the new terms likely only apply to content uploaded in March 2017 or later (note that git commit dates are unreliable, you have to actually check whether the contribution dates March 2017 or later).
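Since git commit timestamps can be set to anything by whoever makes the commit, a check has to treat both the author date and the committer date as merely indicative. A minimal sketch, assuming input in `git log --format='%H %at %ct'` form (hash, author timestamp, committer timestamp as Unix epoch seconds); the cut-off is the 1ˢᵗ March 2017 date from the ToS change:

```python
# Flag commits that may fall under the new ToS: a commit is suspect if
# either its author date or its committer date is on/after 2017-03-01.
# Both timestamps are forgeable, so this is a heuristic, not proof.
from datetime import datetime, timezone

CUTOFF = datetime(2017, 3, 1, tzinfo=timezone.utc).timestamp()

def commits_after_cutoff(log_lines):
    """Return hashes of commits whose author OR committer date is past the cut-off."""
    suspect = []
    for line in log_lines:
        commit, author_ts, committer_ts = line.split()
        if int(author_ts) >= CUTOFF or int(committer_ts) >= CUTOFF:
            suspect.append(commit)
    return suspect
```

Even an empty result proves nothing on its own; to be sure, you would still have to check when the content was actually pushed to GitHub, which git itself does not record.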
And then, most people are likely unaware of the new terms. If they upload content for which they themselves don’t have the appropriate rights (waivers to attribution and copyleft/share-alike clauses), it’s plainly illegal and also makes your upload of it, or a derivative thereof, no more legal.
Granted, people who, in full knowledge of the new ToS, share any “User-Generated Content” with GitHub on or after 1ˢᵗ March, 2017, and actually have the appropriate rights to do that, can do that; and if you encounter such a repository, you can fork, modify and upload that iff you also waive attribution and copyleft/share-alike rights for your portion of the upload. But — especially in the beginning — these will be few and far between (even more so taking into account that GitHub is, legally speaking, a mess, and they don’t even care about hosting only OSS / Free works).
I’ll be starting to remove any such content of mine, such as the source code mirrors of jupp, which is under the GNU GPLv1, now and will be requesting people who forked such repositories on GitHub to also remove them. This is not something I like to do but something I am required to do in order to comply with the licence granted to me by my upstream. Anything you’ve found contributed by me in the meantime is up for review; ping me if I forgot something. (mksh is likely safe, even if I hereby remind you that the attribution requirement of the BSD-style licences still applies outside of GitHub.)
(Pet peeve: why can’t I “adopt a licence” with British spelling? They seem to require overseas barbarian spelling.)
Atlassian Bitbucket has similar terms (even worse actually; I looked at them to see whether I could mirror mksh there, and turns out, I can’t if I don’t want to lose most of what few rights I retain when publishing under a permissive licence). Gitlab seems to not have such, but requires you to indemnify them… YMMV. I think I’ll self-host the removed content.
I’m in contact with someone from GitHub Legal (not explicitly in the official capacity though) and will try to explain the sheer magnitude of the problem and ways to solve this (leaving the technical issues to technical solutions and requiring legal solutions only where strictly necessary), but for now, the ToS are enacted (another point of my criticism of this move) and thus, the aforementioned works must go off GitHub right now.
That’s not to say they may not come back later once this all has been addressed, if it will be addressed to allow that. The new ToS do have some good; for example, the old ToS said “you allow every GitHub user to fork your repositories” without ever specifying what that means. It’s just that the people over at GitHub need to understand that, both legally and technically①, any and all OSS licences② grant enough to run a hosting platform already③, and separate explicit grants are only needed if a repository contains content not under an OSI/OKFN/Copyfree/FSF/DFSG-free licence. I have been told that “these are important issues” and been thanked for my feedback; we’ll see what comes from this.
① maybe with a little more effort on the coders’ side³
③ e.g. when displaying search results, add a note “this is an excerpt, click HERE to get to the original work in its context, with licence and attribution” where “HERE” is a backlink to the file in the repository
④ It is understood those organisations never un-approve any licence that rightfully conforms to those definitions (also in cases like a grant saying “just use any OSS² licence” which is occasionally used)
Update: In the meantime, joeyh has written not one but two insightful articles (although I disagree in some details; the new licence is only to GitHub users (D.5) and GitHub (D.4) and only within their system, so, while uploaders would violate the ToS (they cannot grant the licence) and (probably) the upstream-granted copyleft licence, this would not mean that everyone else wasn’t bound by the copyleft licence in, well, enough cases to count (yes it’s possible to construct situations in which this hurts the copyleft fraction, but no, they’re nowhere near 100%)).
To prepare for future restoration projects, the Sistine Chapel's world-famous frescoes and mosaic floor have gotten the up-close-and-personal treatment by way of an army of DSLRs. The last time the Sistine's masterworks were documented photographically...
I have not seen this painting before. It's called Ivan the Terrible and His Son Ivan on 16 November 1581, and was completed in 1885 by Ilya Repin.
Warped Perspective has an article by Keri O'Shea on the painting:
It took three centuries before this scene was committed to canvas with the gravitas and horror it deserved. The man who proved himself able is arguably Russia’s best-known painter, certainly its best-known Realist painter. That man was Ilya Yefimovich Repin, who returned to historical painting in 1885 to complete ‘Ivan the Terrible and His Son Ivan’. It is to my mind one of the most haunting pieces of art ever created.
The differences between the Realist style used here and the idealised, unrepresentative portraiture of the day are exaggerated hugely by the savagery of this piece. Repin chose to paint the exact moment of Grozny’s revelation; the awful moment of stillness after the manslaughter of his heir. The two men, one living, one dead, are presented alone in a room whose fire-lit warmth gives the lie to the scene and its circumstances. That warmth, and its crimson finery, is ironically juxtaposed with the blood on young Ivan’s head, which is the brightest red here, and the rich, geometric-patterned drapery in the background forms another contrast with Ivan’s curved, inanimate body, fading into nothingness before the grisly focus of the scene. There is evidence of a struggle; furniture is upended, and Ivan’s leg has disarrayed the silk rug beneath their feet – but now all is still. Horribly, terribly still.
However, for all of that, it is Grozny’s haunted expression which retains its capacity to shock. His wide eyes stare into nothing, he is lost in his thoughts; those eyes contrast utterly with the now unseeing eyes of his son. There is a lone tear on young Ivan’s cheek, as he is cradled in death by his now-penitent father, Grozny’s hands clasped ineffectually to the fatal wound. Even knowing the circumstances of this crime, I find Grozny’s expression deeply moving. To my mind, it seems like a Realist take on the Goya painting ‘Saturn Devouring His Son’ – the same blank expression, the same desperation, the same destruction of one’s young. It also creates something which often features in horror – sympathy for the monster, regardless of their deeds. This disturbing image has shocked many through the years; not least, in 1913, when Grozny’s face was badly slashed by a man called Abram Abramovich Balashov. Balashov was removed from the scene shouting, “Enough blood! Down with blood!”
The Toyota Prius made hybrids mainstream. In cities like San Francisco, you can't swing an artisanally carved reclaimed-wood stick without hitting at least one of these midsize cars rolling down the street. By sheer numbers (nearly four million sold!...
Mondo Tees has announced a line of Aliens xenomorph tiki mugs, ("in space, no one can hear you drink"), available for pre-order now with ship dates this summer (some glazes only available at Alamo Drafthouses). (more…)