Hi there! Welcome to the first (public) blog I've ever had.
After successfully returning to flight on January 14, SpaceX will make its next launch from Cape Canaveral no earlier than January 30. With this mission from a new pad at Launch Complex 39A, SpaceX will loft the EchoStar 23 communications satellite to geostationary transfer orbit.
This is a heavy satellite, weighing 5.5 metric tons, and getting it out to about 40,000 kilometers from the surface of the Earth will require pretty much all of the lift capacity of SpaceX's Falcon 9 rocket. That would leave almost no propellant for the booster to fire its engines to slow down, make a controlled descent through the Earth's atmosphere, and attempt a difficult landing on a drone ship.
On Saturday, in response to a question on Twitter, SpaceX founder and chief executive Elon Musk confirmed that the upcoming EchoStar launch will therefore indeed be expendable. "Future flights will go on Falcon Heavy or the upgraded Falcon 9," he added.
Half a year ago I wrote a blog post about debugging over email. This is a follow-up.
That post boiled down to the following advice:
Have an automated way to collect all the usual information needed for debugging: versions, config and log files, etc.
Improve error messages so the users can solve their issues themselves.
Give users better automated diagnostics tools.
Based on further thinking and feedback, I would add:
When a program notices a problem that may indicate a bug in itself, it should collect the necessary information automatically, in a form the user only needs to send on to the developers or support.
The primary goal should be to help people solve their own problems.
A secondary goal is to make the problem reproducible by the developers, or otherwise easy to fix without access to the original system where the problem manifested.
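The first added point above can be sketched as an unhandled-exception hook that bundles the usual diagnostics into one file the user can attach to a bug report. This is only a minimal illustration of the idea, not code from the original post; the report fields and file name are made up, and a real program would add its own config and log files.

```python
import json
import platform
import sys
import traceback

def collect_debug_report(exc_type, exc, tb):
    """Gather the usual debugging information into a single file.

    The fields and the output file name here are hypothetical;
    a real program would also include its version, configuration,
    and recent log output.
    """
    report = {
        "python": sys.version,
        "platform": platform.platform(),
        "argv": sys.argv,
        "traceback": "".join(traceback.format_exception(exc_type, exc, tb)),
    }
    with open("debug-report.json", "w") as f:
        json.dump(report, f, indent=2)
    print("Wrote debug-report.json -- please attach it to your bug report",
          file=sys.stderr)

# Install as the unhandled-exception hook so a report is produced
# automatically whenever the program hits an unexpected bug.
sys.excepthook = collect_debug_report
```

The point is that the user does nothing except send the file: the program decides what is relevant, not the person hitting the bug.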
I've not written any code to help with this remote debugging, but it's something I will start experimenting with in the near future.
Further ideas welcome.
More than 30 years after it debuted, the Amiga continues to fascinate all sorts of computer lovers. For years our Jeremy Reimer has been thoroughly documenting its unique journey in his recurring series, and this is his latest entry. If you're new to the saga, start with part one (on the machine's genesis) and make sure to read the latest entry (part nine, on the Video Toaster) before digging in.
As the 1990s began, Commodore should have been flying high. The long-awaited new Amiga models with better graphics, the A1200 and A4000, were finally released in 1992. Sales responded by increasing 17 percent over the previous year. The Video Toaster had established a niche in desktop video editing that no other computer platform could match, and the new Toaster 4000 promised to be even better than before. After a rocky start, the Amiga seemed to be hitting its stride.
Unfortunately, this success wouldn’t last. In 1993, sales fell by 20 percent, and Commodore lost $366 million. In the first quarter of 1994, the company announced a loss of $8.2 million—much better than the previous four quarters, but still not enough to turn a profit. Commodore had run into financial difficulties before, particularly in the mid-'80s, but this time the wounds were too deep. Sales of the venerable Commodore 64 had finally collapsed, and the Amiga wasn’t able to fill the gap quickly enough. The company issued a statement warning investors of its problems, and the stock plunged. On April 29, 1994, Commodore International Limited announced that it was starting the initial phase of voluntary liquidation of all of its assets and filing for bankruptcy protection. Commodore, once the savior of the Amiga, had failed to save itself.
During useR! 2016, Nick Tierney asked on Twitter about rmarkdown and metropolis, and whether folks had used RMarkdown-driven LaTeX Beamer presentations. My firm hell yeah answer, based on having used mtheme outright or in local mods for quite some time (see my talks page), led to a blog post of mine describing this GitHub repo I had quickly set up during breaks at useR! 2016. The blog post and the repo have some more details on how I do this, in particular about local packages (also with sources on GitHub) for the non-standard fonts I use.
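For anyone who has not tried the combination, a minimal front matter along these lines is all an RMarkdown-driven metropolis deck needs; the title, author, and file name below are made up, and the `theme` and `latex_engine` keys are rmarkdown's standard `beamer_presentation` options, assuming the metropolis Beamer theme is installed in your TeX distribution:

```yaml
---
title: "Example Talk"
author: "Jane Doe"
output:
  beamer_presentation:
    theme: "metropolis"
    latex_engine: xelatex
---

# A Section

## A Slide

- Render with `rmarkdown::render("talk.Rmd")`
```

The repo linked above goes further (colours, fonts, blocks), but this skeleton already produces a metropolis-styled PDF.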
This week I got around to updating the repo / example a little: I made the default colours (in my example) a little less awful, added a page on blocks, and, most importantly, turned the example into the animated gif below:
And thanks to the beautiful tint package (see its repo and CRAN package) I now know how to create a template package. So if there is interest (and spare time), we could build a template package for RStudio too.
With that, may I ask a personal favour of anybody still reading this post? Please do not hit my Twitter handle with support questions. All my code is on GitHub, and issue tickets there are much preferred. Larger projects like Rcpp also have their own mailing lists, and it is much better to use those. And if you like neither, maybe ask on StackOverflow. But please don't spam my Twitter handle. Thank you.
Welcome to Ars Cardboard, our weekend look at tabletop games! Check out our complete board gaming coverage at cardboard.arstechnica.com—and let us know what you think.
The Japanese have a reputation for appreciating the toilet. The poop emoji is a creation of theirs, as is Everyone Poops—that famous kids' book you almost certainly read as a young 'un—while their TV comedies have a reputation for going heavy on the ordure. Japan is also the home of those space-age lavatories with heated seats and two dozen bidet settings, and it's the birthplace of this appetizing dessert treat.
So no one should be surprised that, as we toured last October's Essen gaming fair, the largest board games show on earth, our eyes were caught by a Japanese game called Toire o Yogoshita nowa Dareda? (Or, in English, Who Soiled the Toilet?). Naturally, we had to try it.
Magnetic media, in the form of disk and tape drives, have long been the dominant way of storing bits, but the speed and low power of flash memory have been displacing them from consumer systems, and even faster forms of long-term memory are in development. A new paper, however, suggests that magnetic media may still be competitive—you just have to stop reading and writing them with magnets.
Using a specific form of garnet and some ultrafast laser pulses, a Dutch-Polish team of researchers performed what they suspect is the fastest read/write of magnetic media ever. And, for good measure, the process was extremely energy efficient.
Heat is actually a problem for both hard drives and flash. Although it doesn't create a problem in most consumer systems, dealing with excess heat is a major issue in data centers. The problem, according to the authors of the new paper, is one of scale. While we can calculate the minimum energy needed to flip a magnetic bit, we use much more than that to ensure that every bit gets written as intended. Eight orders of magnitude more, in fact. Most of that excess energy dissipates into the environment as heat.
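To put a rough number on that gap: the minimum energy referenced here is presumably the thermodynamic (Landauer) bound on switching one bit, which at room temperature works out to

```latex
E_{\min} = k_B T \ln 2
         \approx (1.38\times10^{-23}\,\mathrm{J/K})
                 \times (300\,\mathrm{K}) \times 0.693
         \approx 2.9\times10^{-21}\,\mathrm{J}.
```

Eight orders of magnitude above that floor puts a conventional bit write somewhere around $10^{-13}$ J, and nearly all of that difference ends up shed as heat.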
“Hi, you’ve reached Eran. Please leave a message, and I’ll get back to you.”
That's the voicemail greeting on my cell phone; I recorded it in high school and still can't figure out how to change it. Although I'm still a loyal proponent of phone calls themselves, I have to admit I probably don't check my voicemail as much as I should.
And it’s not just me. As young people shy away from leaving voice messages when an e-mail or text message can instantly reach business colleagues, we may have moved beyond the simple answering machine.