Items tagged with: years
HN Discussion: https://news.ycombinator.com/item?id=19957405
Posted by hairytrog (karma: 267)
Post stats: Points: 132 - Comments: 54 - 2019-05-20T03:53:13Z
#HackerNews #2017 #garden #once #sealed #that #was #watered #years
HackerNewsBot debug: Calculated post rank: 106 - Loop: 111 - Rank min: 100 - Author rank: 37
HN Discussion: https://news.ycombinator.com/item?id=19950229
Posted by curtis (karma: 23903)
Post stats: Points: 89 - Comments: 66 - 2019-05-18T23:09:05Z
#HackerNews #dna #family #identify #killer #later #latest #police #science #seattle #use #years
HackerNewsBot debug: Calculated post rank: 81 - Loop: 71 - Rank min: 80 - Author rank: 102
HN Discussion: https://news.ycombinator.com/item?id=19927056
Posted by D_Alex (karma: 1393)
Post stats: Points: 101 - Comments: 73 - 2019-05-16T08:06:06Z
#HackerNews #essay #fukushima #photo #years
HackerNewsBot debug: Calculated post rank: 91 - Loop: 183 - Rank min: 80 - Author rank: 30
HN Discussion: https://news.ycombinator.com/item?id=19867120
Posted by dz0ny (karma: 1268)
Post stats: Points: 155 - Comments: 46 - 2019-05-09T11:19:21Z
#HackerNews #after #banned #business #from #paypal #years
HackerNewsBot debug: Calculated post rank: 118 - Loop: 27 - Rank min: 100 - Author rank: 67
HN Discussion: https://news.ycombinator.com/item?id=19829412
Posted by nradov (karma: 10684)
Post stats: Points: 154 - Comments: 34 - 2019-05-04T21:10:57Z
#HackerNews #all #bentley #for #lived #marie #organs #places #rose #the #with #wrong #years
HackerNewsBot debug: Calculated post rank: 114 - Loop: 102 - Rank min: 100 - Author rank: 23
HN Discussion: https://news.ycombinator.com/item?id=19828131
Posted by onetimemanytime (karma: 2023)
Post stats: Points: 112 - Comments: 98 - 2019-05-04T18:08:31Z
#HackerNews #after #are #finally #recession #rising #the #wages #years
HackerNewsBot debug: Calculated post rank: 107 - Loop: 165 - Rank min: 100 - Author rank: 16
HN Discussion: https://news.ycombinator.com/item?id=19802678
Posted by troydavis (karma: 4279)
Post stats: Points: 172 - Comments: 31 - 2019-05-01T23:23:24Z
#HackerNews #goo #later #update #world #years
HackerNewsBot debug: Calculated post rank: 125 - Loop: 112 - Rank min: 100 - Author rank: 54
HN Discussion: https://news.ycombinator.com/item?id=19689946
Posted by pseudolus (karma: 18980)
Post stats: Points: 154 - Comments: 32 - 2019-04-18T10:18:23Z
#HackerNews #airbus #airliner #and #boeing #first #japan-built #takes #years
HackerNewsBot debug: Calculated post rank: 113 - Loop: 192 - Rank min: 100 - Author rank: 79
HN Discussion: https://news.ycombinator.com/item?id=19672795
Posted by uptown (karma: 64952)
Post stats: Points: 161 - Comments: 35 - 2019-04-16T11:44:14Z
#HackerNews #access #data #discussed #documents #facebook #for #plans #sell #show #user #years
HackerNewsBot debug: Calculated post rank: 119 - Loop: 192 - Rank min: 100 - Author rank: 109
HN Discussion: https://news.ycombinator.com/item?id=19671611
Posted by thereare5lights (karma: 192)
Post stats: Points: 128 - Comments: 44 - 2019-04-16T06:42:17Z
#HackerNews #2008-9 #about #boeing #for #period #the #time #worked #years
HackerNewsBot debug: Calculated post rank: 100 - Loop: 96 - Rank min: 100 - Author rank: 15
Under pressure from Reddit's administrators over copyright issues, the site's largest forum dedicated to piracy discussion has opted for "The Nuclear Option". After voting by its contributors, all…
Article word count: 1077
HN Discussion: https://news.ycombinator.com/item?id=19600296
Posted by okket (karma: 36420)
Post stats: Points: 123 - Comments: 89 - 2019-04-07T22:15:52Z
#HackerNews #almost #avoid #ban #deleting #history #piracy #reddits #ten #years
With around a quarter of a billion monthly users, Reddit is one of the most important sites on the Internet.
The site plays host to millions of live discussions on countless topics ranging from the mundane to obviously controversial.
Recently we’ve reported on the troubles being faced by /r/piracy, Reddit’s most popular sub-Reddit focused on piracy discussion.
In an article published mid-March 2019, we reported how the moderators of the forum were making best efforts to keep content on the right side of the law and within Reddit’s rules. Just a handful of days later, however, the moderators received notice from Reddit that they were receiving too many copyright complaints from rightsholders.
For a sub-Reddit that has strict rules forbidding anyone posting links to infringing content, the notification came as a disappointment. While some complaints were legitimate (some people simply won’t abide by the rules and some posts do get missed), many were not. This placed the forum’s moderators between a rock and a hard place.
According to some of the copyright notices filed with Reddit, simply posting an alleged pirate site homepage URL warranted a complaint, even when that URL didn’t link to any infringing content. We’ve seen the same kind of issues before, when copyright holders have made attempts to have site homepages delisted from Google, despite their content never appearing there.
Further complicating the process is that the moderators of /r/piracy have no ability to respond to potentially false allegations. If a user makes a post that results in a copyright notice, only that user (or Reddit’s admins) is in a position to dispute the claim with the notice sender, so that rarely happens. Even if it does, nothing is made public.
Meanwhile, the notices keep building up, valid or not, despite the moderators’ best efforts. Even people simply posting the names of releases are being flagged for copyright infringement, though doing so isn’t illegal in any form. As a result, those posts too are now being removed, as quickly as the mods can reach them.
“I have begun unofficially removing release posts and it’s quite sad considering that a rather large bulk of our users look forward to them every day, I know I did,” moderator ‘dysgraphical‘ informs TF.
“We have had days when releases were the highlight of the day filled with hundreds of comments of excited people discussing the film. This has all been scrubbed now. We recently had an April Fool’s ‘Avengers: Endgame’ release post hit r/all and while the community was happy to meme on being fooled, a few users were concerned that copyright holders might act on it and have it removed.”
It’s nothing less than self-censorship in response to sloppy and/or fraudulent claims, but these are testing times.
But the really big issue here relates to the huge archive of posts already present on /r/piracy – some ten years’ worth of discussions. Is there anything in there that could warrant a surprise complaint? Apparently so, since rightsholders have been digging up issues from the past and complaining to Reddit.
This left the moderators of /r/piracy with a huge dilemma. Uncertain of what lay in the archives, and only able to be confident about the more recent state of play, they asked the community for input on the ‘Nuclear Option‘ – deleting every post more than six months old, just to be sure.
After the votes were counted, those in favor of deleting the archives outnumbered those asking for preservation by ten to one. All that was left was to find a way to begin deleting history, around 9.5 years of posts. A script was created and put into motion and the purge began.
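The purge script itself wasn’t published, but the core of any such script is simple: select everything past the age cutoff, then remove it. A minimal sketch of the selection step (the `posts_to_purge` helper and the dict layout are illustrative assumptions, not the moderators’ actual code):

```python
import time

SIX_MONTHS = 182 * 24 * 60 * 60  # roughly six months, expressed in seconds

def posts_to_purge(posts, now=None, max_age=SIX_MONTHS):
    """Return the posts older than max_age.

    Each post is a dict carrying a 'created_utc' Unix timestamp,
    mirroring the field name Reddit's API uses for submissions.
    """
    if now is None:
        now = time.time()
    return [p for p in posts if now - p["created_utc"] > max_age]

# With "now" pinned, only the year-old post is selected for deletion.
now = 1_554_250_000  # early April 2019, around when the purge began
posts = [
    {"id": "old", "created_utc": now - 2 * SIX_MONTHS},  # about a year old
    {"id": "new", "created_utc": now - 3600},            # one hour old
]
print([p["id"] for p in posts_to_purge(posts, now=now)])  # prints ['old']
```

The deletion side would then walk that list through Reddit’s rate-limited API, which is why a run over 9.5 years of posts could plausibly take weeks.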
“Given the speed, this might take weeks,” says moderator ‘dbzer0’, a nine-year veteran of the sub-Reddit.
It’s unclear when this sweeping process will be fully completed, but it’s hoped that it can keep the community alive. Not all of the moderators were in favor of the mass deletion since it, of course, deletes the community’s history too.
“The Scrubbing [as the deletion process is now called] is just a poorly, rushed attempt to elongate the community’s lifespan on Reddit,” dysgraphical says.
“We have already seen this performed in other subreddits in which mod teams have bent over backwards to please the administration by implementing their own set of stringent rules. These communities no longer exist.”
But the vote was cast and the final decision appears to have been a democratic one rooted in self-preservation. It does raise interesting points, however.
The recently highlighted situation shows that sub-Reddits devoted to controversial topics – especially those related to piracy – are at risk of being targeted. When they are, the copyright notice and counter-notice process is somewhat undermined.
While users can be banned for repeat infringements, it’s trivial to open a new account. And when the notices start to pile up on Reddit – legitimately or not – whole communities can be banned, despite working above and beyond the requirements of the law.
“The issue at hand is not that r/Piracy distributes copyrighted content, but rather that the discussion of digital piracy is no longer protected; it never was,” dysgraphical adds.
“As copyright holders continue pushing the envelope, by claiming that the mention of streaming sites infringe their IP, Reddit will continue complying and effectively ban r/Piracy. Copyright holders on Reddit no longer need to dig deep to find infringing content, they can pick any thread or comment at random that loosely relates to their IP, and file a DMCA takedown notice.”
To give a school analogy: if a few kids misbehave, are misinterpreted, or are incompetently targeted, the whole class gets kept behind after school – before being permanently expelled. It’s effectively mass punishment based on the acts of a few – or the whims of bots.
Finally, subscriptions to /r/piracy have risen steadily and are now edging towards 370,000 subscribers, but the ongoing purge is having a clear effect on traffic to the sub-Reddit once two unusual peaks (including the April 1 surge) are discounted.
Reddit’s /r/piracy traffic stats
Whether the popular forum can recover from this decline remains to be seen, but it’s clear that deleting most of its history is already causing pain. The big question is whether Reddit’s admins are taking note of this huge olive branch or whether they’ll still choose to chop down the whole tree regardless.
HackerNewsBot debug: Calculated post rank: 111 - Loop: 394 - Rank min: 100 - Author rank: 69
Linux Journal's very first issue featured an interview between LJ's first Publisher, Robert Young (who went on to co-found Red Hat among other things), and Linus Torvalds (author of the Linux kernel).…
Article word count: 4027
HN Discussion: https://news.ycombinator.com/item?id=19559970
Posted by axiomdata316 (karma: 3085)
Post stats: Points: 137 - Comments: 62 - 2019-04-03T01:01:16Z
#HackerNews #interview #later #linus #torvalds #with #years
Linux Journalʼs very first issue featured an interview between LJʼs first Publisher, Robert Young (who went on to co-found Red Hat among other things), and Linus Torvalds (author of the Linux kernel). After 25 years, we thought itʼd be interesting to get the two of them together again. You can read that first interview from 1994 here.
Interview: Linus Torvalds and Robert Young
Robert Young: It is a great pleasure to have an excuse to reach out to you. How are you and your family? Your kids must be through college by now. Nancy and I and our three daughters are all doing well. Our eldest, Zoe, who was 11 when Marc and I started Red Hat, is expecting her second child—meaning Iʼm a grandparent.
Linus Torvalds: None of my kids are actually done with college yet, although Patricia (oldest) will graduate this May. And Celeste (youngest) is in her senior year of high school, so weʼll be empty-nesters in about six months.
All three are doing fine, and I suspect/hope it will be a few years until the grandparent thing happens.
Bob: When I first interviewed you back in 1994, did you think that youʼd be still maintaining this thing in 2019?
Linus: I think that by 1994 I had already become surprised that my latest project hadnʼt just been another "do something interesting until it does everything I needed, and then find something else to do" project. Sure, it was fairly early in the development, but it had already been something that I had spent a few years on by then, and had already become something with its own life.
So I guess what Iʼm trying to say is not that I necessarily expected to do it for another few decades, but that it had already passed the bump of becoming something fairly big in my life. Iʼve never really had a long-term plan for Linux, and I have taken things one day at a time rather than worry about something five or ten years down the line.
Bob: There is a famous old quote about the danger of achieving your dreams—your running joke back in the day when asked about your future goals for Linux was "world domination". Now that you and the broader Open Source/Free Software community have achieved that, whatʼs next?
Linus: Well, I stopped doing the "world domination" joke long ago, because it seemed to become less of a joke as time went on. But it always was a joke, and it wasnʼt why I (or any of the other developers) really did what we did anyway. It was always about just making better technology and having interesting challenges.
And none of that has really changed on a core level. All the details have changed—the hardware is very different, the problems we have are very different, and my role is very different. But the whole "make it better and have interesting challenges" is all the same.
For example, back in 1994, I was mostly a developer. Sure, I was the lead maintainer, but while I spent a lot of time merging patches, I was also mostly writing my own code. These days I seldom write much code, and the code I write is often pseudo-code or example patches that I send out in emails to the real developers. Iʼd hesitate to call myself a "manager", because I donʼt really do things like yearly reviews or budgets, etc. (thank God!), but I definitely am more of a technical lead person than an actual programmer, and thatʼs been true for the last many years.
So the truly big-picture thing hasnʼt changed, but my role and all the details obviously look very very different from 1994.
Bob: Where will you and this code base be in another quarter century?
Linus: Well, Iʼll be 75 by then, and I doubt Iʼll be involved day to day. But considering that Iʼve been doing this for almost 30 years, maybe Iʼd still be following the project.
And the good news is that we really do have a pretty solid developer base, and Iʼm not worried about "where will Linus be" kind of issues. Sure, people have been talking about how kernel developers are getting older for a long time now, but thatʼs not really because we wouldnʼt be getting any new people, itʼs literally because we still have a lot of people around that have been around for a long time, and still enjoy doing it.
I used to think that some radical new and exciting OS would come around and supplant Linux some day (hey, back in 1994 I probably still thought that maybe Hurd would do it!), but itʼs not just that weʼve been doing this for a long time and are still doing very well, Iʼve also come to realize that making a new operating system is just way harder than I ever thought. It really takes a lot of effort by a lot of people, and the strength of Linux—and open source in general, of course—is very much that you can build on top of the effort of all those other people.
So unless there is some absolutely enormous shift in the computing landscape, I think Linux will be doing quite well another quarter century from now. Not because of any particular detail of the code itself, but simply fundamentally, because of the development model and the problem space.
I may not be active at that point, and a lot of the code will have been updated and replaced, but I think the project will remain.
Bob: Have you and the kernel team been updating the kernel code to your satisfaction through the years? Is there any need or pressure to re-write any of the 25-year-old ever-expanding Linux code base? Perhaps in a more "modern" language than C?
Linus: Weʼve gone through many many big rewrites of most of the subsystems over the years—not all at once, of course—and many pieces of code end up being things that nobody really wants to modify any more (most often because they are drivers for ancient hardware that very few people really use, but that we still support). But one of the advantages of a big unified source base for the whole kernel has been that when we need to make some big change, we can do so. There may be a few out-of-tree drivers, etc., around (both source and binary), but weʼve always had a policy that if they are out of tree, they donʼt matter for development. So we can make radical changes when necessary.
As to C, nothing better has come around. Weʼve updated the kernel sources for new and improved features (the C language itself has changed during the years weʼve been doing this), and weʼve added various extensions on top of C for extra type-checking and runtime verification and hardening, etc., but on the whole, the language is recognizably the same except for small details.
And honestly, it doesnʼt look likely to change. The kind of languages people see under active development arenʼt for low-level system programming. They are to make it easier to create user applications with fancy UIs, etc. They explicitly donʼt want to do things a kernel needs, like low-level manual memory management.
I could imagine that weʼd have some "framework" language for generating drivers or similar, and we internally actually have our own simplified "language" just for doing configuration, and we do use a few other languages for the build process, so itʼs not like C is the only language we use. But itʼs the bulk of it by far, and itʼs what the "kernel proper" is written in.
Bob: Whatʼs your hardware instrument of choice? Is there a Stradivarius of Linux (or any) laptops out there? Or tablet or phone?
Linus: My main development machine is a very generic PC workstation. Itʼs a franken-machine with different parts cobbled together over the years. Itʼs nothing particularly special, and itʼs actually been two years since I made any big changes to it, so itʼs not even anything bleeding-edge. My main requirement at home is actually that it be basically entirely silent. Outside a couple fans, there are no moving parts (so no spinning disks anywhere), and the fans are not even running most of the time.
On the road (which is happily not that often), my main requirement is a good screen and being lightweight. My target weight is 1kg (with charger), and honestly, Iʼve not been able to hit that ideal target, but right now, the best compromise for me is the XPS13.
Bob: It seems Linux on the desktopʼs success was not on the PC desktop but on the device desktop via Android. What are your thoughts on this?
Linus: Well, the traditional PC is obviously no longer quite the dominant thing it used to be. Even when you have one (and even when itʼs still running Windows or OS X), lots of people mainly interact with it through a web browser and a couple random apps. Of course, then there are the "workstation" users, which is kind of the desktop I was personally always envisioning. And while still important, it doesnʼt seem to drive the market the way the PC did back when. Powerful desktop machines seem to be mostly about development or gaming, or media editing. The "casual" desktop seems to have become more of a browser thing, and quite often itʼs just a tablet or a phone.
Chrome seems to be doing fine in some of that area too, of course. But yes, in just numbers of people interacting daily with Linux, Android is obviously the huge bulk of it.
[Note from Bob: In the strict sense of "dominant", this is probably fair. But despite the recent fall in total numbers of PCs shipped in the last couple years, the cumulative growth in the PC market between 1994 and, say, 2014 is such that even in a slow PC market today, the world is still installing four or five times as many PCs every year compared to 1994.]
Bob: If you had to fix one thing about the networked world, what would it be?
Linus Torvalds (Image Courtesy of Peter Adams, The Faces of Open Source Project)
Linus: Nothing technical. But, I absolutely detest modern "social media"—Twitter, Facebook, Instagram. Itʼs a disease. It seems to encourage bad behavior.
I think part of it is something that email shares too, and that Iʼve said before: "On the internet, nobody can hear you being subtle". When youʼre not talking to somebody face to face, and you miss all the normal social cues, itʼs easy to miss humor and sarcasm, but itʼs also very easy to overlook the reaction of the recipient, so you get things like flame wars, etc., that might not happen as easily with face-to-face interaction.
But email still works. You still have to put in the effort to write it, and thereʼs generally some actual content (technical or otherwise). The whole "liking" and "sharing" model is just garbage. There is no effort and no quality control. In fact, itʼs all geared to the reverse of quality control, with lowest common denominator targets, and click-bait, and things designed to generate an emotional response, often one of moral outrage.
Add in anonymity, and itʼs just disgusting. When you donʼt even put your real name on your garbage (or the garbage you share or like), it really doesnʼt help.
Iʼm actually one of those people who thinks that anonymity is overrated. Some people confuse privacy and anonymity and think they go hand in hand, and that protecting privacy means that you need to protect anonymity. I think thatʼs wrong. Anonymity is important if youʼre a whistle-blower, but if you cannot prove your identity, your crazy rant on some social-media platform shouldnʼt be visible, and you shouldnʼt be able to share it or like it.
Oh well. Rant over. Iʼm not on any social media (I tried G+ for a while, because the people on it werenʼt the mindless usual stuff, but it obviously never went anywhere), but it still annoys me.
Bob: This issue of Linux Journal focuses on Kids and Linux. Is there any advice youʼd like to give to young programmers/computer science students?
Linus: Iʼm actually the worst person to ask. I knew I was interested in math and computers since an early age, and I was largely self-taught until university. And everything I did was fairly self-driven. So I donʼt understand the problems people face when they say "what should I do?" Itʼs not where I came from at all.
Bob: The very first time you and I met was at a Digital Equipment Company (DEC) tradeshow. It was on your very first trip to the US that Jon "maddog" Hall and DEC financed.
Linus: I think actually that was my second trip to the US. The first was, I believe, a trip for me to Provo, Utah, to talk with Novell about Linux (for a project inside Novell that was then to become Caldera).
But yes, the DECUS tradeshow (in New Orleans? Maybe I misremember) was certainly among my earliest trips to the US.
Bob: I asked how you were going to catch up with all the emails you missed by the time you returned to Helsinki. Your answer surprised me, and Iʼve been quoting you ever since. You simply said you would send the backlog of emails to /dev/null. I expressed shock and asked you, "but what if there were important emails in your inbox?" You shrugged and replied, "If it was important, the writer would just send it again." Possibly the most liberating piece of advice anyone had ever given me. Do you still follow that philosophy of email handling?
Linus: Itʼs still somewhat true, but at the same time, Iʼve also changed my workflow a lot so that travel wouldnʼt be as disruptive to my work as it used to be. So these days I often strive to have people not even notice when Iʼm on the road all that much. I will give people a heads-up if I expect to be without much internet connectivity for more than a day or two (which still happens in some places of the world—particularly if youʼre a scuba diver), but most of the time, I can do my work from anywhere in the world. And I try (and sometimes fail) to time my trips so that theyʼre not in the merge window for me, which is when I get the most pull requests.
So these days I keep all my email in the cloud, which makes it much easier to switch between machines, and it means that when I travel and use my laptop, itʼs not nearly as much of a pain as it used to be back in the days when I downloaded all my email to my local machine.
And itʼs not just about my email—the fact that almost all the kernel development ends up being distributed through git also means that itʼs much less of an issue what machine I am at, and synchronization is so much easier than it used to be back when I was working with patches coming in individually through email.
Still, my "if itʼs really important, people will re-send" belief stands. People know that Iʼm around pretty much 7/365, and if I donʼt react to a pull request in a couple days, it still means that it might have gotten lost in the chaos that is my email, and people send me a follow-up email to ping me about it.
But itʼs actually much less common than it used to be. Back in 1994, I wasnʼt all that overworked, and being gone a week wasnʼt a big deal, but it got progressively worse during the next few years, to the point where our old email-and-patches-based workflow really meant that I would sometimes have to skip patches because I didnʼt have the time for them, knowing that people would re-send.
Those times are all happily long gone. BitKeeper made a big difference for me, even if not all maintainers liked it (or used it). And now git means that I donʼt get thousands of patches by email any more, and my inbox doesnʼt look as bad as it used to be. So itʼs easier to stay on top of it.
By the way, perhaps even more important than the "If it was important the writer would just send it again" rule is another rule Iʼve had for the longest time: if I donʼt have to reply, I donʼt. If I get a piece of email and my reaction is that somebody else could have handled it, I will just ignore it. Some busy email people have an automatic reply saying "sorry, Iʼll try to get to your email eventually". Me, I just ignore anything where I feel it doesnʼt absolutely concern me. I do that simply because I feel like I canʼt afford to encourage people to email me more.
So I get a lot of email, but I donʼt actually answer most of it at all. In a very real sense, much of my job is to be on top of things and know whatʼs going on. So I see a lot of emails, but I donʼt usually write a lot.
Bob: At a talk at the Washington DC Linux user group meeting back in May 1995, that Don Becker organized, you stopped halfway through and asked the audience if anyone knew the score of the Finland-Sweden menʼs world championship hockey game. As the token Canadian in the room, I was able to assure you that Finland won that game. On that topic: Finlandʼs recent win of the World Junior Championship must have been fun for you. Or were you cheering for the US?
Linus: Heh. Hockey may be the Finnish national sport (and playing against Sweden makes it more personal—I speak Swedish as my mother language, but Iʼm Finnish when it comes to nationality), but Iʼm not a huge sports fan. And moving to the US didnʼt mean that I picked up baseball and football, it just meant that ice hockey lost that "people around me cared" part too.
Bob: Many of us admire your willingness to call a spade a spade in public debates on Linux technology decisions. Others, um, dislike your forthright style of arguing. Do you think you are becoming more or less diplomatic as time goes on?
Linus: If anything, I think I have become quieter. I wouldnʼt say "more diplomatic", but perhaps more self-aware, and Iʼm trying to be less forceful.
Part of it is that people read me a different way from how they used to. It used to be a more free-wheeling environment, and we were a group of geeks having fun and playing around. Itʼs not quite the same environment any more. Itʼs not as personal, for one thing—we have thousands of people involved with development now, and thatʼs just counting actual people sending patches, not all the people working around it.
And part of the whole "read me in a different way" is that people take me seriously in a way they didnʼt do back in 1994. And thatʼs absolutely not some kind of complaint about how I wasnʼt taken seriously back then—quite the reverse. Itʼs more me grumbling that people take me much too seriously now, and I canʼt say silly stupid cr*p any more.
So Iʼll still call out people (and particularly companies) for doing dumb things, but now I have to do it knowing that itʼs news, and me giving some company the finger will be remembered for a decade afterwards. Whether deserved or not, it might not be worth it.
Bob: Anything else you want to comment on, either publicly or otherwise?
Linus: Iʼve never had some "message" that I wanted to spread, so ...
About Robert Young and What Heʼs Been Up to in the Past 25 Years
Graduating from the University of Toronto in 1976 after studying history, Young took a job selling typewriters. In 1978, he founded his first company and then spent 15 years in Canada at the helm of two computer-leasing companies. He sold the second of these to a larger firm that moved him to Connecticut in 1992 to grow its small US subsidiary. Shortly after, the new parent company ran into financial difficulties, otherwise known as bankruptcy, and Young found himself working out of his wifeʼs sewing closet.
Robert Young, LJʼs First Publisher
That event led directly, in 1993, to co-founding Red Hat (NYSE: RHT) with Marc Ewing, a young North Carolina-based software engineer. Both of them had fallen in love with free software, now known as open source—Ewing because he could innovate with software that came with source code and a license that allowed him to do so, and Young because he could see how technology customers could be better served with open technology than the closed proprietary alternatives the industry offered at the time. Serving as CEO from founding through Red Hatʼs IPO in 1999, he then moved to the role of Chairman, and the brilliant Matthew Szulik took over as CEO, building the early Red Hat into a great business. Red Hat is now a member of the S&P 500 Index of the largest US public companies.
In 2000, Young and Ewing co-founded the Center for Public Domain, a non-profit foundation created to bolster healthy conversation of intellectual property, patent and copyright law, and the management of the public domain for the common good. Grant recipients included the Electronic Frontier Foundation and the Creative Commons.
In 2003, Young purchased the Hamilton Tiger-Cats of the Canadian Football League, and he currently serves as the leagueʼs Vice-Chairman.
Working with a talented team led by Gart Davis, he helped launch Lulu.com in 2004 as the first online self-publishing service to use print-on-demand technology, enabling a new generation of authors to bring their works directly to market while avoiding the delays, expense and limited profitability of publishing through traditional channels. Under the direction of Kathy Hensgen, Lulu continues to be a leading innovator helping authors bring their works to market.
In 2012 Young invested in PrecisionHawk, a small drone company led by Ernie Earon and Christopher Dean. PrecisionHawk, based in Raleigh, has become one of the leading drone technology companies in the US. He continues to serve as Chairman, with CEO Michael Chasen.
Since 2016, Young has been involved with Scott Mitchell and a team based in Toronto, helping organize the Canadian Premier League, a professional soccer league in Canada. He owns the Hamilton Forge franchise. The league will begin play this month (April 2019).
His favorite current project is helping his wife Nancy run Raleigh-based Elizabeth Bradley Design Ltd and its Needlepoint.com store, a leading needlepoint supplier. Their mission is nothing less than to make the world a more beautiful place, by growing the community of enthusiastic needlepointers around the world.
His most beloved pastime is spending time with his growing family. He and his wife Nancy welcomed their first grandchild a year ago. Young also enjoys pursuing a bunch of hobbies, always badly: fly fishing, kite boarding and golf. He also collects the occasional antique typewriter, a nod to his beginnings as a typewriter salesman.
Sidenote: the Faces of Open Source Project
The photo of Linus in this article is by Peter Adams, a photographer I met a few months ago when he introduced me to a series he started in 2014 called Faces of Open Source. On that site, Peter writes, "Despite its wide ranging impact, the open source revolution remains all but unknown to most people who now, more than ever before, depend on its survival. This project is an attempt to change that." His purpose applies not only to the muggles who rely on open source, but to the wizards who write their own code and put it to use. Knowing who created the Open Source world we have now will surely help as we code up a future that embodies the same good values.—Doc Searls
Five years ago, Lenz (one of our co-founders) wrote: At iwantmyname everyone earns the same. This sounds strange to many and I get asked a lot of questio……
HN Discussion: https://news.ycombinator.com/item?id=19538107
Posted by polymetis (karma: 46)
Post stats: Points: 127 - Comments: 51 - 2019-03-31T20:21:04Z
#HackerNews #experiment #one-salary #ten #the #years
picture of a person experimenting with electricity
Five years ago, Lenz (one of our co-founders) wrote:
At iwantmyname everyone earns the same. This sounds strange to many and I get asked a lot of questions about how this may work once we grow bigger and the honest answer is: “I don’t know, but so far it works” and I give that same answer since we started 5 years ago. The underlying idea has two main roots. First, we really think that everyone is as important to the success of our team as anyone else in the team. We don’t believe in a hierarchy or in more important people. If we hire you, we think you are valuable and want you to be part of our team as a level peer, not an underling who does the stuff no one else wants to do.
Soon after, it hit the Hacker News front page, and the top comment went as follows (from a person called fishtoaster):
That’s a cool experiment. It’s always neat to see people trying new, weird ways to run a company. That said, I would predict the following:
* As the company grows, they have trouble hiring specialists or more senior people, since they’re competing with other companies for those people, but without the flexibility to offer a comparable salary. They could solve this by paying their highest-paid person what they’re worth, and everyone else the same, but that could be prohibitively expensive.
* The need will develop for people who, though valuable, are plentiful (e.g., a janitor, but fill in any role here that’s generally near the bottom of the pay scale). The decision will be “We’d really like a janitor, but not enough to pay $X”, where X is their everyone-salary (which has to be high enough to attract their most valuable people). As such, they’ll be hard-pressed to hire roles that aren’t really worth that much to them.
* Of course, you can solve either of those by having more money than you know what to do with. So, if they’re wildly profitable, it’s a system that’ll keep working.
That’s just my prediction, though. I’d love to see a followup blog post in a few years describing how it went.
Well, fishtoaster… this one’s for you.
One thing that happened over the last 5-10 years is that people’s idea of what “market rate” is for a remote developer job seems to be shifting to the market rate of San Francisco. And that’s a tough place to play if you A. don’t have beaucoup VC money, B. aren’t sitting on a pile of disposable cash.
For instance, according to PayScale, the average software developer in Wellington NZ makes NZ$64k. We pay more than that, but when you become a remote company, people start looking for the $134k USD salary people are making in the top 10% of SF (which seems a bit low to me, but I’ll take their word for it). In one beautiful locale we pay in the top 10%, and in the other, we’re pretty meh.
So when fishtoaster says,
The decision will be “We’d really like a janitor, but not enough to pay $X”, where X is their everyone-salary (which has to be high enough to attract their most valuable people).
… I feel that. It’s a legitimate concern. Basically, we’re playing a game where the success of the company hinges on our ability to hire good developers, but we can’t offer top SF rates without shrinking our staff because our operating costs would be too high due to our flat structure. Fortunately for us, we’re competing not only on salary, but freedom, and that shouldn’t be taken lightly.
For as much as we hear about greed and people doing anything for money, the truth is that the world is large and full of people with different motivations. We’ve had people in emerging economies mention top-10% SF rates, but we’ve also had people in the more expensive parts of EU work for less because of the lifestyle we can somewhat uniquely offer. For example, I could probably make more in a management role elsewhere, but at iwantmyname, I get to pick my kids up from school every day without fail. To me, that’s worth a whole lot, but to someone else, it might mean very little. We’re all motivated by very different things.
I’m sure certain individuals are paid somewhat below market value while others are paid handsomely for their role. This can hurt when hiring employees in the former category. The flat hierarchy, general job benefits, and culture need to make up for the market pay-cut.
(Random quotes are from a team poll I did about our one-salary structure. You’ll see them sprinkled throughout.)
For us, everything has mostly worked out — we offer a certain amount, have had little trouble filling job openings, and our turnover is low (and to my knowledge, no one who has ever left did it primarily because of our pay structure). That said, I do feel the pressure of getting us closer to SF-senior-dev competitive wages because recruitment will inevitably get harder if the pay gap becomes too wide. We’ll never be able to pay the top-end salary of a Fortune 500 company under our current structure unless we stumble our way into a mind-blowing new market segment, but we can pull ourselves closer without making non-dev positions unreasonably expensive.
To be able to hire and retain good staff, the amount also needs to be focused on the high end of what’s needed to be competitive. What may be a good salary for support or marketing or whatever elsewhere may be quite low for good development talent, for example, and people are rarely, or only to a very limited degree, willing to accept a pay cut because they like a company’s culture or the projects they’ll be doing, etc.
Motivation and turnover
Turnover is a tricky thing to talk about because it feels like weakness, but it’s impossible to avoid when running a business. And it’s important to know that all companies are working from a different baseline. In a previous life, I worked at an ad agency that somewhat purposefully churned through recent college grads to maximize staff ROI (turns out, cheap labor being client-billed for ~$150/hr is a good way for owners to get rich). That world is different than this world though — clients came and went, onboarding was basically instant because each project was from scratch, and the work was more about immediate impact than retaining institutional knowledge. Turnover there was like watching the seasons change. So it went.
In a small tech shop like iwantmyname though, institutional knowledge is everything because nearly nothing is built from the ground up. And onboarding is relatively painful because we don’t have people we can just dedicate to training. Turnover sucks, so we do our best to avoid it. And I think we’ve done a pretty good job. Here’s what our “stats” look like:
* 20 total employees
* all but four are still involved with the company
* all four that left were developers
As I said before, overall retention is good… much better than at any job I’ve had in the past. But while the sample size is small, there is a relationship between market rate and turnover. While we have 100% retention for non-developers, our typical dev window is roughly three years (again, small sample size… compounded by the fact that iwantmyname basically didn’t do any hiring for the first five years).
Here’s the thing though. Even in Silicon Valley, where salaries are extremely high across the board and employees are showered in benefits, tech retention is low. I don’t think income is the primary motivator here (although I’m sure it helps with recruiting).
My general theory is that people are motivated by two things in life: pain avoidance and happiness. No matter what you do, you’re always trying to find the thing that brings the most happiness with the least amount of pain. Some people find higher peak happiness through suffering, and some people are hedonists who work to experience minimal pain, but both are operating under the same framework.
So what we’re looking at here is a dev vs non-dev marketplace of employees. Every individual is different, but most people find happiness at work through money and perks (remote work, working on stuff you like, freedom to make decisions, etc.). The tangle is that in this market, non-devs tend to experience pain while looking for jobs, while even mediocre devs routinely get contacted by recruiters offering ++ salaries and signing bonuses. There’s no friction for a developer, and without friction, there’s no pain. So as soon as things lose their shine, stats show that developers tend to move on. And it’s understandable — why stagnate when there are endless opportunities to do new and interesting things?
From a management perspective, all I can do is optimize the things I can control to reduce pain. Here are the three things I focus on:
1. Freedom. The worst part of being in a traditional office is seeing how fast your calendar gets booked. Endless meetings, pointless team-building activities, random reporting reminders to make middle managers happy. Some of it is useful, but I do my very best to clear up people’s schedules so they can maximize their time not working. To me, the ideal workplace is one that lets me enjoy my own life… not one that tries to merge with it.
2. Dysfunction. People tend to not like dysfunction, and dysfunction is really just a symptom of not understanding priorities. Yes, some people are just lazy, but laziness aside, the best thing a manager can do is to be extraordinarily clear about the work ahead. Generally, if you can get smart people to work towards the same goals while avoiding institutional confusion, they’ll achieve great things.
3. Fairness. The other thing people tend to not like is the feeling that their peers aren’t holding up their end of the bargain. And it’s especially bad when the person telling the team what to do isn’t doing anything themselves. The best way to make this a non-issue is to get dirty from the top-down. No one is above the rest — not even managers.
One-salary makes everyone feel they are an equal part of the team. Whether they’re in support, dev, or product areas, all have an equal voice. That equality helps avoid the “that’s above my pay grade, let someone else deal with it” attitude.
(A completely flat structure) negates the need for traditional performance reviews and salary negotiation, which are very difficult for some people and do not favour those who have not been raised/trained/educated to advocate and negotiate hard for themselves. (And can be actively penalized for it, in the cases of some demographics.)
Is there a better way?
When I talk about our flat salary, I often come at it from the angle of focusing on what alternatives would buy us. Barring a cash infusion, we have X dollars that can be allocated however we want. Right now we know precisely how much each position costs (because we all cost the same), but if we moved off our flat structure, what would that look like?
My best guess is that developer rates would go up, support rates would go down, and everyone else would be roughly the same. But it’s not like rates would change proportionally. At best it’s a wash, with support rates going down exactly the same as developer rates are going up, but that’s silly. SF rates for senior devs would cost far more than the savings we’d make on the support side, and the talent drain in support would make for a far inferior product. Plus, our support staff does a lot more than their direct tasks — they’re worth every penny they get paid.
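That intuition can be made concrete with a back-of-the-envelope sketch. To be clear, every headcount and rate below is invented for illustration; none are iwantmyname's actual figures.

```python
# Illustrative only: compare total payroll under a flat salary versus a
# hypothetical tiered structure. All headcounts and rates are invented.
flat_rate = 90_000
headcount = {"dev": 8, "support": 6, "other": 6}

# Flat structure: everyone earns the same, so payroll is trivially predictable.
flat_total = flat_rate * sum(headcount.values())

# Tiered structure: dev rates bid up toward market, support rates bid down.
tiered_rates = {"dev": 130_000, "support": 70_000, "other": 90_000}
tiered_total = sum(tiered_rates[role] * n for role, n in headcount.items())

# With these made-up numbers, the dev raises cost more than the support cuts
# save: flat_total is 1,800,000 while tiered_total is 2,000,000.
```

The point of the sketch is only that "at best it's a wash" requires the support-side savings to exactly offset the dev-side raises, which is unlikely when dev market rates move faster than everything else.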
In reality, our flat-salary has probably allowed the company to grow faster than it would’ve under a traditional structure. I don’t think that was the intention going in, but it’s essentially put a movable ceiling on rates that have increased exponentially since we started around the 2008 recession. Our salaries are tied to profit, and while they’ve gone up over time, they just don’t compare to the whims of a company tied to an out-of-control market or VC-infused debt.
To me (and to be clear, I’m part of the flat salary with no ownership stake), it’s a fair deal because everything was on the table from day one. Everyone makes X, there’s no excessive Apple-esque cash hoard that’s being syphoned to investors instead of the staff, and we’re free to leave at any time.
With that said, if we were to change, here are three models that seem acceptable (all have pros and cons):
1. From an employee standpoint, the Basecamp model is a good way to maximize the retention and recruitment of “senior” talent while mostly keeping out the ugly popularity contests that come with salary negotiations. My understanding, based on my recollection of the book It Doesn’t Have to Be Crazy at Work, is that they pay an upper-tier salary per position based (I think) on SF rates. They also use a junior/senior tier, which allows them to recruit raw talent without worrying about fair workloads (it’d be hard to justify paying the same salary to two devs who have vastly different levels of experience, but there’s also a lot of “grunt work” that needs to be done that might drive a senior person away).
2. This one was floated by someone in the internal survey: “A possible solution to retaining talent longer term (who may start looking to move on for financial reasons) is to add a ‘years with the company’ accelerator. Putting the fiscal focus on retaining experience and talent seems more logical to me than hiring a ‘rockstar’ candidate and offering them double their colleague’s salary so they can shoot through some work and leave in a year or two.” I generally like this, but it would probably have to be capped at some amount to avoid paying people too far above market rates. While retention is important, I don’t know if it’d make business sense to, say, pay someone triple what a comparable replacement would cost. And you don’t want to be in the position of laying off a tenured staffer because you mistakenly started paying them too much.
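A capped accelerator like the one floated above could be sketched in a few lines. The base salary, per-year rate, and cap below are all hypothetical numbers chosen for illustration, not a real proposal.

```python
def tenure_salary(base: float, years: int, rate: float = 0.03, cap: float = 1.25) -> float:
    """One flat base salary, scaled up by a per-year tenure accelerator and
    capped as a multiple of base, so long tenure never drifts too far above
    the market rate for a comparable replacement."""
    multiplier = min(1 + rate * years, cap)
    return base * multiplier

# With a hypothetical 90,000 base: a new hire earns 90,000, a five-year
# veteran roughly 103,500, and a ten-year veteran hits the cap at 112,500.
```

The cap is doing the real work here: it keeps the retention incentive while bounding the worst case the business has to carry.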
3. I’m not sure how the accounting would work, but if we kept the salary the same and tied it to a quarterly bonus structure based on company profits, it could create a more direct link between short-term productivity and income. (That’s not really a fundamental change though… more like a nudge.)
Anyways, fishtoaster, here we are — getting by with a system that’s held up strong for ten years. We don’t have “more money than we know what to do with” (YET!!!), but I think everyone would agree that iwantmyname sits in the successful column of small businesses. And to me, that’s a win in the one-salary column.
I’ll make a note on my calendar to update this in 2024.
Years-old posts by Facebook CEO Mark Zuckerberg have disappeared — meaning key moments in the company's history have vanished.
Article word count: 1159
HN Discussion: https://news.ycombinator.com/item?id=19527200
Posted by crones (karma: 84)
Post stats: Points: 154 - Comments: 67 - 2019-03-30T02:15:20Z
#HackerNews #deleted #facebook #mark #mistakenly #old #posts #years #zuckerbergs
Old Facebook posts by Mark Zuckerberg have disappeared — obscuring details about core moments in Facebookʼs history.
On multiple occasions, years-old posts by the 34-year-old billionaire chief executive that were previously public, and were reported on by news outlets at the time, have since vanished, Business Insider has found. That includes all of the posts he made during 2007 and 2008.
Reached for comment, a Facebook spokesperson said the posts were "mistakenly deleted" due to "technical errors."
"A few years ago some of Markʼs posts were mistakenly deleted due to technical errors. The work required to restore them would have been extensive and not guaranteed to be successful so we didnʼt do it," the spokesperson said in a statement.
"We agree people should be able to find information about past announcements and major company news, which is why for years weʼve shared and archived this information publicly — first on our blog and in recent years on our Newsroom."
These disappearances, along with other changes Facebook has made to how it saves its archive of announcements and blog posts, make it much harder to parse the social networkʼs historical record. This makes it far more difficult to hold the company, and Zuckerberg himself, accountable to past statements — particularly during a period of intense scrutiny of the company in the wake of a string of scandals.
The very nature of the issue means it is extremely challenging to make a full accounting of exactly what has gone missing over the years. The spokesperson said they didnʼt know how many posts in total were deleted.
The curious case of Mark Zuckerbergʼs vanishing Facebook posts
In April 2012, Facebook acquired Instagram — a now-pivotal moment in the growth of the Menlo Park, California technology giant. Multiple news reports from the time quoted from a public post that Zuckerberg made on his timeline about the acquisition — but that post is now inaccessible.
The links to that post from old news articles no longer work, and itʼs nowhere to be seen on his profile.
It was an important document in the history of Facebook, particularly given Zuckerberg promised that "weʼre committed to building and growing Instagram independently" — a commitment he has since walked back. Facebook is now integrating the photo-sharing app into itself ever more closely, and tensions around this contributed to the departure of Instagramʼs two cofounders in September 2018. (A copy of Zuckerbergʼs post has, however, been preserved in Facebookʼs Newsroom blog.)
The most drastic deletions involved entire years. Throughout both 2006 and 2009, Zuckerberg was regularly active on the social network — but there are no posts visible of any kind for the two full years in between. The spokesperson confirmed that all the posts during 2007 and 2008 were deleted.
This is the error message you see when you click on a link to one of Mark Zuckerbergʼs vanished Facebook posts. (Image: Facebook)
Another, specific example from later on: Facebookʼs beloved head chef Josef Desimone died in a motorcycle crash in July 2013. TechCrunch reported at the time that Zuckerberg shared the news in a post on Facebook. However, that post is now inaccessible as well.
Facebook would go on to throw a party in Desimoneʼs memory at its headquarters the following month. Hundreds of people were invited, and booze flowed freely — and it subsequently descended into chaos. As Business Insider previously reported, multiple fights broke out among attendees, which security staff believed were gang-related.
Numerous other posts by Zuckerberg from these time periods remain publicly available.
Facebook has also made it harder to navigate its archives of old announcements
Lastly, there have been issues accessing Facebookʼs archive of older blog posts.
In years past, Facebook had a dedicated blog that announcements would often be posted to and which was navigable by month; an archived example of a post is available here, via the Internet Archive. But at some point — itʼs not clear when, exactly — Facebook launched its new "Newsroom," a repository for its key announcements, and broke the public links to old blog posts.
Now, when you click on a link to a blog post included in an old news story, it redirects you to the Newsroom. The Newsroom doesnʼt have copies of many of these old blog posts, meaning thereʼs no easy way to access them.
They do still exist in one form — as a "note" saved to Facebookʼs public "Facebook" page on the social network. But until today there was no centralized archive through which to browse them, like what was available for the Facebook blog, or like what exists today for Newsroom posts.
Instead, to read a specific one, you had to either know about it already and search for keywords on Google, or scroll back through the Facebook pageʼs thousands of posts over the years.
After Business Insider reached out for comment, Facebook added a public "notes" tab to the Facebook page to access them. As of press time, however, the notes are loading extremely slowly.
Take, for example, the 2006 launch of the News Feed, now an advertising juggernaut that makes billions of dollars for the company. Zuckerberg tried to quash early user backlash against the News Feed with a blog post called "Calm down. Breathe. We hear you," but a link to it in a TechCrunch news report from the time now just redirects to the Newsroom homepage.
Thereʼs no copy of the blog post in the Newsroom, and itʼs currently only available as a note from Facebook. Hereʼs how it looked prior to the closure of the blog, according to the Internet Archive.
The net effect of this change to the archives was to drastically obfuscate Facebookʼs historical record — making it far harder to find past statements and announcements from the company about itself.
Mark Zuckerbergʼs content has gone AWOL before
This isnʼt the first time Zuckerberg-related material has disappeared without warning from Facebook.
In April 2018, TechCrunch reported that messages sent by the CEO were being deleted from other peopleʼs inboxes without their knowledge or consent — a feature that wasnʼt available to ordinary Facebook users at the time.
And back in November 2016, public posts from Zuckerberg about the media and Facebookʼs role in the 2016 US election also disappeared, The Verge reported at the time.
At the time, a spokesperson told The Verge that their removal was an accident, and the posts were subsequently restored.
Do you work at Facebook? Got a tip? Contact this reporter via Signal at +1 (650) 636-6268 using a non-work phone, email at firstname.lastname@example.org, Telegram or WeChat at robaeprice, or Twitter DM at @robaeprice. (PR pitches by email only please.) You can also contact Business Insider securely via SecureDrop.
Alcohol that makes you feel drunk without the coinciding hangover may be available within five years, according to researchers.
3/30/06 From The Lord, Our God and Savior
The Word of The Lord Spoken to Timothy
For All Those Who Have Ears to Hear
Thus says The Lord, The God of #Israel, The Only Lord of Hosts: My anger is aroused and shall not be quenched, until every tall #tower is torn down and shaken to #dust, and not one fenced #city is left standing! In My #hot displeasure, I shall turn My hand against the #earth, and smite every #bird, #beast and crawling thing! Every #fish shall die, every #creature of the #sea shall perish, when I strike the #waters! I shall not relent nor turn back, until every #blade of #grass is burned up, and every #tree has been stricken and bears neither #leaf nor #fruit! The earth shall be forsaken for a #time and made utterly desolate, with the #kingdoms of #men left in ruins!
Therefore gather together, O #children of the #man of perdition, all you who bear the #number of his #name; come and gather against My #Holy #Mountain! Yes, look up and behold the #Glory of God wrought in His #Mighty and #Strong One, The Holy One of #Israel, and prepare to #meet your #end! - #Consumed in His glory, cut asunder by the #sword of His #mouth! For you are accursed in My #sight, and there shall be none to #deliver! Your #bodies shall consume away; your #eyes in their #sockets, your #tongues in your #mouths, your #flesh from your #bones! For thus is the #reward of all who come out to #fight against Me in that day!... It is finished.
Behold a new day, the seventh, even one thousand #years!...
The #kingdoms of this #world have become
The #Kingdom of God and His #Messiah,
And He shall reign forever and ever!...
Says The Lord.
↑ Zechariah 14:12 (HNV)
#prophecy #prophet #Jesus #Yeshua #Christ #Messiah #God #church #bible #scripture #christian #christianity #JesusChrist #HolySpirit #Savior #Saviour #Lord
HN Discussion: https://news.ycombinator.com/item?id=19350060
Posted by jonbaer (karma: 43186)
Post stats: Points: 91 - Comments: 45 - 2019-03-10T03:47:46Z
#HackerNews #chinas #finds #for #gdp #growth #inflated #nine #pace #study #was #years
In this graphic, the University of Cambridge’s Luke Kemp compiled a list of civilisations to compare how long they lasted.
Article word count: 399
HN Discussion: https://news.ycombinator.com/item?id=19307759
Posted by pitzahoy (karma: 102)
Post stats: Points: 108 - Comments: 92 - 2019-03-05T03:02:27Z
#HackerNews #336 #ancient #average #civilization #lasted #the #years
One way to look at the rise and fall of past civilisations is to compare their longevity. This can be difficult, because there is no strict definition of civilisation, nor an overarching database of their births and deaths.
In the graphic below, I have compared the lifespan of various civilisations, which I define as a society with agriculture, multiple cities, military dominance in its geographical region and a continuous political structure. Given this definition, all empires are civilisations, but not all civilisations are empires. The data is drawn from two studies on the growth and decline of empires (for 3000-600BC and 600BC-600), and an informal, crowd-sourced survey of ancient civilisations (which I have amended).
(Graphic credit: Nigel Hawtin)
Civilisation [Duration in years]
Ancient Egypt, Old Kingdom
Ancient Egypt, Middle Kingdom
Ancient Egypt, New Kingdom
Norte Chico Civilisation
Harappan Civilisation (Indus Valley Civilisation)
Elam Civilisation (Awan Dynasty)
Minoan Civilisation (Protopalatial)
Third Dynasty of Ur
Old Assyrian Empire
Middle Assyrian Empire
Neo Assyrian Empire
Elam Civilisation (Eparti Dynasty)
First Babylonian Dynasty
Old Hittite Empire
Minoan Civilisation (Neopalatial)
Middle Hittite Kingdom
Elam Civilisation (Middle Elamite Period)
New Hittite Kingdom
Zhou Dynasty (Western Period)
Kingdom of Israel and Judah
Zhou Dynasty (Eastern Zhou Spring Period)
Zhou Dynasty (Eastern Zhou Warring States Period)
Elam Civilisation (Neo-Elamite Period)
Chaldean Dynasty (Babylon)
First Chera Empire
Early Chola Empire
Han Dynasty (Western Period)
Kingdom of Armenia
Hsiung Nu Han
Three Kingdoms of Korea
Han Dynasty (Eastern Period)
Tʼu Chueh Turk
Luke Kemp is a researcher based at the Centre for the Study of Existential Risk at the University of Cambridge. He tweets @lukakemp.
Its creation is a perfect illustration of how science progresses
Article word count: 3995
HN Discussion: https://news.ycombinator.com/item?id=19285105
Posted by jkuria (karma: 6508)
Post stats: Points: 143 - Comments: 26 - 2019-03-01T20:47:57Z
#HackerNews #150 #old #periodic #table #the #this #week #years
“La république n’a pas besoin de savants ni de chimistes” (“The republic has no need of scholars or chemists”). With that curt dismissal a court in revolutionary France cut short the life of Antoine-Laurent de Lavoisier, argued by some to be the greatest chemist of all. Lavoisier’s sin was tax farming. He had been a member of the firm that collected the monarchy’s various imposts and then, having taken its cut, passed what remained on to the royal treasury. That he and many of his fellow farmers met their ends beneath a guillotine’s blade is no surprise. What had distinguished Lavoisier from his fellows, though, was what he chose to spend his income on. For much of it went to create the best-equipped chemistry laboratory in Europe.
Nothing comes of nothing. Where the story of the periodic table of the elements really starts is debatable. But Lavoisier’s laboratory is as good a place as any to begin, for it was Lavoisier who published the first putatively comprehensive list of chemical elements—substances incapable of being broken down by chemical reactions into other substances—and it was Lavoisier and his wife Marie-Anne who pioneered the technique of measuring quantitatively what went into and came out of a chemical reaction, as a way of getting to the heart of what such a reaction really is.
Lavoisier’s list of elements, published in 1789, five years before his execution, had 33 entries. Of those, 23—a fifth of the total now recognised—have stood the test of time. Some, like gold, iron and sulphur, had been known since ancient days. Others, like manganese, molybdenum and tungsten, were recent discoveries. What the list did not have was a structure. It was, avant la lettre, a stamp collection. But the album was missing.
Creating that album, filling it and understanding why it is the way it is took a century and a half. It is now, though, a familiar feature of every high-school science laboratory. Its rows and columns of rectangles, each containing a one- or two-letter abbreviation of the name of an element, together with its sequential atomic number, represent an order and underlying structure to the universe that would have astonished Lavoisier. It is little exaggeration to say that almost everything in modern science is connected, usually at only one or two removes, to the periodic table.
The Lavoisiers’ careful measurements had discovered something now thought commonplace—the law of conservation of matter. Chemistry transforms the nature of substances, but not their total mass. That fact established, another Frenchman, Louis-Joseph Proust, extended the idea with the law of definite proportions. This law, published in 1794, the year of Antoine Lavoisier’s execution, states that the ratio by weight of the elements in a chemical compound is always the same. It does not depend on that compound’s method of preparation. From there, it might have been a short step for Proust to arrive at the idea of compounds being made of particles of different weights, each weight representing a specific element. But he did not take it. That insight had to wait for John Dalton, a man who was the polar opposite of the aristocratic bon vivant Lavoisier. Dalton’s parents were so poor that he had been put to work at the age of ten. The man himself was an ascetic, colour-blind Quaker. And he was English.
Dalton lived in Manchester, at a time when it was the world’s largest industrial city. He made a modest living tutoring, but spent most of his energy on scientific research, including into colour-blindness, a condition still sometimes referred to as Daltonism. That inquiry came to nothing. But during the first decade of the 19th century he took Proust’s concept and showed not only that elements reacted in fixed proportions by weight, but also that those proportions were ratios of small whole numbers. The simplest way to explain this—and indeed the way that Dalton lit upon—was to suppose each element to be composed of tiny, indivisible particles, all of the same weight. The Greek word for indivisible is “atomos”. Thus was the atom born.
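Dalton’s reasoning can be made concrete with a back-of-the-envelope calculation. The sketch below, in Python, uses carbon’s two oxides and modern atomic weights of 12 for carbon and 16 for oxygen; these figures are assumed for illustration and are not from the article.

```python
# Carbon forms two oxides. Per fixed mass of carbon, the masses of
# oxygen they contain stand in a small whole-number ratio: Dalton's
# clue that matter comes in discrete atoms.

def oxygen_per_gram_of_carbon(mass_carbon, mass_oxygen):
    """Mass of oxygen combined with one gram of carbon."""
    return mass_oxygen / mass_carbon

# Carbon monoxide (CO): 12 g of carbon combines with 16 g of oxygen.
monoxide = oxygen_per_gram_of_carbon(12.0, 16.0)

# Carbon dioxide (CO2): the same 12 g of carbon combines with 32 g.
dioxide = oxygen_per_gram_of_carbon(12.0, 32.0)

# The two combining proportions differ by exactly a factor of two.
ratio = dioxide / monoxide
assert ratio == 2.0
```

The same small-whole-number pattern holds for any pair of compounds of the same two elements, which is precisely what indivisible atoms of fixed weight would produce.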
Dalton based his system of relative atomic weights on hydrogen, the atoms of which he found to be the lightest. And it was quickly picked up by someone who, though less famous than Lavoisier, perhaps because of his grisly end, was arguably the greater man. Jacob Berzelius, a Swede, furnished chemistry with its language. It was he who came up with the idea of the abbreviations that now occupy the periodic table’s rectangles. It was he who combined those abbreviations with numbers, indicating the proportions involved, to make formulae for chemical compounds: H₂O (water), H₂SO₄ (sulphuric acid), NaCl (table salt). And it was he who used these formulae to describe reactions: H₂SO₄ + Zn → ZnSO₄ + H₂ (sulphuric acid plus zinc becomes zinc sulphate plus hydrogen). Though Dalton invented atomic theory, it was Berzelius who embedded it at the heart of the subject.
And Berzelius did more. He used Alessandro Volta’s recently invented battery, which created electricity from a chemical reaction, to do the reverse. He employed electricity to drive chemical reactions in solutions (for example, releasing metallic copper from a solution of copper sulphate), a process called electrolysis.
Back in England, Humphry Davy, inventor of the miner’s safety lamp, picked up the idea of electrolysis and supercharged it. He employed a more powerful version of Volta’s battery to decompose molten materials, rather than solutions. In this way he discovered sodium and potassium in 1807 and magnesium, calcium, strontium, barium and boron in 1808. He also showed that chlorine, previously thought to be a compound of oxygen, was actually an element.
After Davy’s work, new elements came thick and fast. Iodine (1811). Cadmium and selenium (1817). Lithium (1821). Silicon (1823). Aluminium and bromine (1825). By then there were enough of them for the next step on the journey to be taken.
It had been apparent from the time of their discovery that sodium and potassium were similar, as were calcium, strontium and barium. Lithium, when discovered, proved similar to sodium and potassium. Likewise, bromine and iodine proved similar to chlorine. In 1829 Johann Dobereiner, a German, noticed a curiosity about these trios (members of groups now known, respectively, as alkali metals, alkaline earths and halogens), and also another triplet that shared similar properties: sulphur, selenium and tellurium. In each case, if the members were arranged in order of atomic weight, the middle element (sodium, strontium, bromine, selenium) had a weight that was the average of the lightest and the heaviest of the three. Dobereiner called this the law of triads. It was the first hint of some underlying pattern.
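Dobereiner’s law of triads is easy to check. A minimal sketch, using rounded atomic weights of the kind available in the 19th century (figures assumed for illustration, not taken from the article):

```python
# Dobereiner's triads: sort each trio by atomic weight and the middle
# element's weight is close to the mean of the outer two.

triads = {
    "alkali metals":   {"Li": 7.0,  "Na": 23.0, "K": 39.0},
    "alkaline earths": {"Ca": 40.0, "Sr": 88.0, "Ba": 137.0},
    "halogens":        {"Cl": 35.5, "Br": 80.0, "I": 127.0},
}

def middle_is_average(weights, tolerance=0.05):
    """True if the middle weight is within 5% of the outer two's mean."""
    light, middle, heavy = sorted(weights)
    return abs((light + heavy) / 2 - middle) / middle < tolerance

for name, trio in triads.items():
    assert middle_is_average(trio.values()), name
```

For lithium, sodium and potassium the rule is exact: (7 + 39) / 2 = 23.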
The stamp collection continued to grow. Thorium was discovered in 1829 (by Berzelius, as it happened). Lanthanum followed in 1838, erbium in 1843 and ruthenium in 1844. Then, in 1860, Robert Bunsen, inventor of the burner that bears his name, showed how new elements could be recognised from brightly coloured lines in the spectra obtained when materials containing them were heated in a flame. This approach was an instant success. Bunsen and his colleague Gustav Kirchhoff added caesium (1860) and rubidium (1861) to the list. Others, copying them, added thallium (1861) and indium (1863). Spectroscopic analysis’s greatest triumph, though, was helium (1868). This was recognised not from a sample in the flame of a Bunsen burner but in the spectrum of the sun.
As more and more elements turned up, so the search for order intensified. In 1864 John Newlands, a Briton, almost got it. He published what he called the law of octaves. Arranging the known elements in order of atomic weight, he believed he had discerned that, like a musical scale, every eighth element “rhymed” in the ways that sodium rhymed with potassium, and chlorine with bromine.
The trouble with Newlands’ scheme was that an awful lot of the rhymes were forced. A glance at a modern periodic table shows why. For the tall, outer columns (and discounting hydrogen, which is a law unto itself) Newlands’ octaves work perfectly for the lightest elements then known. From the row beginning with potassium (K, from the Latin kalium, meaning potash), however, the tall outer columns are split asunder by the intrusion of ten other, shorter ones known as the transition metals. To deal with that intrusion using data then available required a mixture of luck and genius. And a few years after Newlands published, a lucky genius wrestled with the question in his study in St Petersburg.
Albert Einstein, dapper in his youth, cultivated a waywardness of appearance in old age that has contributed to the trope of the mad professor. Dmitri Mendeleev (pictured) looked like that from the beginning—having his hair cut just once a year by a shepherd, using wool shears. He also behaved like a mad professor. He was prone to dancing rages that put one biographer in mind of the protagonist of “Rumpelstiltskin”, a children’s fairy tale. Also like Rumpelstiltskin he proved, metaphorically at least, able to spin straw into gold.
For a time, Mendeleev had worked in Germany with Bunsen and Kirchhoff, but he had fallen out with them and returned home. In 1869 he was professor of general chemistry at the University of St Petersburg and was writing a Russian-language textbook on the subject. On February 14th of the Julian calendar then in use in Russia (February 26th by the Gregorian calendar employed in most of the rest of Europe), having addressed halogens and alkali metals, he was racking his brains for an organising principle to act as a template for the rest. The 14th was a Friday, and the problem obsessed him more and more over the weekend. But on Monday 17th, while waiting for a sleigh to take him to the railway station for a trip to an estate he had bought in the countryside, he had a brainwave.
Mendeleev was an inveterate player of patience. His brainwave was to recognise that, just as games of patience require the player to organise the pack as a grid of suits in order of the value of the cards, so the elements might be arranged by their atomic weights in “suits” that shared chemical and physical properties. By making his own pack, with each card representing one of the 63 then-known elements, he was able to embark on what was arguably the most important game of patience ever played.
He claimed subsequently that the answer had come to him in a dream. Perhaps. But after having worked for four days on the problem without much rest, the boundary between sleep and wakefulness must have been pretty blurred. Whatever the details, the result was a grid of cards that arranged the elements in a pattern (see picture). He published it two weeks later.
His grid was not perfect. Indeed, it was full of holes. But those holes (some of them, anyway) turned out to be keystones. Though there was no reason, in the 1860s, to believe that all the elements had been discovered, Newlands had behaved as though they had been. Mendeleev had enough confidence to leave gaps in order to make the pattern work. At the time, some took this as a sign of weakness. In fact, it was a sign of strength—the more so because, for several of the gaps, he described in detail the properties of the elements he predicted would fill them, and these predictions were, by and large, fulfilled.
Similarly, there are places in Mendeleev’s original table where it works only by cheating—that is, by swapping two adjacent elements between the places to which their atomic weights assign them. Here, Mendeleev argued that the accepted weights were incorrect, and needed re-measuring. Sometimes, he turned out to be correct about this, too. But not always. A few such pairs, cobalt and nickel for example (which actually share a slot in the published table), remained stubbornly out of kilter, providing evidence that atomic weight was really a proxy for some deeper structural principle.
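Those out-of-kilter pairs can be exhibited directly. The sketch below uses modern rounded atomic weights (an assumption for illustration; argon, one of the pairs shown, was not known in 1869) to contrast ordering by weight with ordering by atomic number:

```python
# A handful of adjacent pairs sit in the "wrong" order when sorted by
# atomic weight, but fall into line when sorted by atomic number.

elements = [  # (symbol, atomic number, atomic weight)
    ("Ar", 18, 39.95), ("K", 19, 39.10),
    ("Co", 27, 58.93), ("Ni", 28, 58.69),
    ("Te", 52, 127.60), ("I", 53, 126.90),
]

by_weight = [symbol for symbol, _, _ in sorted(elements, key=lambda e: e[2])]
by_number = [symbol for symbol, _, _ in sorted(elements, key=lambda e: e[1])]

# Ordered by weight, each pair is swapped relative to its chemistry.
assert by_weight == ["K", "Ar", "Ni", "Co", "I", "Te"]
# Ordered by atomic number, each pair sits where the table needs it.
assert by_number == ["Ar", "K", "Co", "Ni", "Te", "I"]
```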
Crucially, Mendeleev was not constrained, as Newlands had been, by preconceptions about how things ought to be. At points where the octave rule did not work, he let the grid burst out of its corset. This can be seen at both the top and the bottom of the published table.
The upper-right-hand extension contains the transition metals. Here, subsequent discoveries have proved Mendeleev more or less correct in his insights. The lower-left-hand one is more problematic. Its contents are a grab bag, though it does contain all of the then-known members of the set of elements called lanthanides. Arguably, Mendeleev was lucky that by 1869 only three lanthanides had been discovered. In a modern table there are 15 and, together with the actinides below them, they form an awkward interpolation that is often relegated to the bottom as an asterisked footnote. Whether Mendeleev’s game of chemical patience would have been helped or hindered by having more lanthanides in the pack is an intriguing question.
There was also an invisible gap, the filling of which was one of the table’s greatest triumphs. Helium, which Mendeleev ignored because its atomic weight could not be established, turned out to be the lightest member of a whole, new row (or column, in a modern table). These are the noble gases, undiscovered previously because they are chemically inert. The others are neon, argon, krypton, xenon and radon.
Like Davy’s discoveries, the noble gases came all of a tumble. All but radon were the work of William Ramsay, a Briton. With various collaborators, Ramsay isolated argon in 1894, helium in 1895 and neon, krypton and xenon in 1898. Instead of chemistry, he used physical processes. All except helium were products of the newly developed technology of cryogenics, which he used to liquefy air and then separate it into its components, according to their boiling points. Helium, he found by heating a mineral called cleveite.
The 1890s also saw the first inklings that atoms themselves might not, despite the meaning of their name, be truly indivisible. The initial evidence that atoms could spin off parts of themselves, and must therefore have smaller components, came in 1896. That was when Henri Becquerel, who was investigating the nature of phosphorescence, left some uranium salts on top of photographic plates wrapped in black paper and found that the plates got fogged. Thus did Becquerel discover radioactivity.
The following year, J.J. Thomson worked out that “cathode rays” emitted into a vacuum by a negative electrode were electrically charged particles that weighed far less than any atom. Then, in 1899, Ernest Rutherford, a former student of Thomson’s, showed that Becquerel’s radiation had two components, which he dubbed “alpha” (heavy, positively charged particles) and “beta” (light, negatively charged ones).
Becquerel himself, in 1900, showed that beta particles were the same as Thomson’s cathode rays. Seven years later, Rutherford demonstrated that alpha particles were helium ions (thus incidentally explaining why cleveite, which is an ore of uranium, is also a source of helium). The stage was now set for some of the most important experiments in history: Rutherford’s attempts to find out what atoms looked like.
One previous guess had been that they were vortices in the luminiferous aether through which light and radio waves were thought to propagate. That hypothesis, however, died with the aether itself, after the Michelson-Morley experiment of 1887 failed to detect it. Rutherford’s experiments, conducted between 1908 and 1910, probed matter by firing alpha particles at gold foil. Most sailed through, to be recorded by a scintillation screen beyond the foil. But a few were deflected from their courses, to be recorded by other screens, including one behind the source. This screen’s recording of alpha particles returning whence they had come was described by Rutherford as being “almost as incredible as if you fired a 15-inch shell at a piece of tissue paper and it came back and hit you”. His explanation, now abundantly proved true, was that the atoms in the foil had tiny, positively charged nuclei, which were reflecting the positively charged alpha particles, and that these nuclei were surrounded by electrons.
Regardless of an atom’s exact nature, losing alpha and beta particles necessarily changes it. Such radioactive decay proved a source of yet more members of the periodic table. Polonium and radium—decay products of uranium—were found in 1898 by Pierre and Marie Curie. Actinium, the lightest actinide, followed in 1899. Radon was recognised in 1900. Protactinium in 1913.
Models of the atom also became more sophisticated. In 1913, Rutherford and a Danish colleague, Niels Bohr, suggested electrons orbit the nucleus as planets orbit the sun, with electrical attraction playing the role of gravity. In the same year Henry Moseley, another of Rutherford’s confrères, found a mathematical relationship between an element’s X-ray spectrum when bombarded with electrons and its atomic number in the table. In pairs like cobalt and nickel, where the table had been fudged, Moseley confirmed the fudges to be correct. He tidied up the lanthanides, predicting missing elements as Mendeleev had done. He also predicted two new transition metals, with atomic numbers 72 and 75, which duly turned up in 1923 (hafnium) and 1925 (rhenium).
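Moseley’s relationship can be sketched quantitatively. For the K-alpha X-ray line, frequency is proportional to (Z − 1)², with a constant of roughly 2.47 × 10¹⁵ Hz (a textbook value, assumed here rather than taken from the article). That lets an element’s atomic number be read straight off its spectrum:

```python
import math

# Moseley's law for K-alpha lines: frequency = C * (Z - 1)**2, where C
# is about three-quarters of the Rydberg frequency (~2.47e15 Hz).
K_ALPHA_CONSTANT = 2.47e15  # Hz

def k_alpha_frequency(atomic_number):
    """Predicted K-alpha X-ray frequency for an element of number Z."""
    return K_ALPHA_CONSTANT * (atomic_number - 1) ** 2

def atomic_number_from_frequency(frequency):
    """Invert Moseley's law to recover Z from a measured line."""
    return round(math.sqrt(frequency / K_ALPHA_CONSTANT) + 1)

# Cobalt (27) and nickel (28) are unambiguous by X-ray spectrum,
# even though their atomic weights sit in the "wrong" order.
assert atomic_number_from_frequency(k_alpha_frequency(27)) == 27
assert atomic_number_from_frequency(k_alpha_frequency(28)) == 28
```

A gap in the sequence of recovered numbers, such as 72 or 75, is a missing element; this is how Moseley’s successors knew to hunt for hafnium and rhenium.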
Moseley’s X-ray spectra demonstrated that an element’s atomic number does not depend directly on its atomic weight. Rutherford soon showed that the atomic number is actually the number of positively charged particles, which came to be known as protons, in a nucleus. Even though protons weigh almost 2,000 times as much as electrons, the two have equal (though opposite) charges. An atom, which has equal numbers of both, is therefore electrically neutral. Protons are not, though, heavy enough to account for measured atomic weights. That requires a second, electrically neutral particle, the neutron, discovered by James Chadwick in 1932. Neutrons are also the reason that an element can have atoms of different atomic weights, known as isotopes. These isotopes have the same number of protons but different numbers of neutrons.
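The link between isotopes and measured atomic weight is a simple weighted average. A sketch using chlorine’s modern isotope masses and abundances (figures assumed for illustration, not from the article):

```python
# An element's chemistry is fixed by its proton count; its measured
# atomic weight is an abundance-weighted average over its isotopes.

def average_atomic_weight(isotopes):
    """isotopes: iterable of (isotope mass, fractional abundance)."""
    return sum(mass * abundance for mass, abundance in isotopes)

# Chlorine: about 75.8% Cl-35 and 24.2% Cl-37.
chlorine = [(34.97, 0.758), (36.97, 0.242)]
weight = average_atomic_weight(chlorine)  # close to the familiar 35.45
assert abs(weight - 35.45) < 0.01
```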
The Bohr-Rutherford model of the atom had a problem, though. Electrostatic forces should pull the electrons into the nucleus rather than keeping them in orbit. Here, the new science of quantum mechanics came to the rescue. Quantum theory requires objects to be both particles and waves. The wavelike aspect of electrons means that when they circle an atomic nucleus they settle into self-reinforcing three-dimensional standing waves, called orbitals. The stability of these standing waves stops the electrons being drawn into the nucleus. And here, at last, is the explanation for why the periodic table is the way that it is.
For reasons deep in the heart of quantum mechanics, each orbital can have either one or two electrons in it, but not more. The orbitals themselves come in different types (see diagram) and these are arranged in shells around a nucleus. The first shell has one type “s” orbital, for a maximum of two electrons. The second, a type s and three type p, for a maximum of eight. The third has one s, three p and five d, for a maximum of 18. The fourth, one s, three p, five d and seven f, for a maximum of 32. Et cetera. The names are derived from the spectral lines seen by Bunsen and his followers. The colours of these lines represent energy released as light by electrons moving between orbitals.
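The arithmetic of shells is easy to verify. Each orbital type numbered l contributes 2l + 1 orbitals, each holding two electrons, so shell n holds 2n² electrons in all:

```python
# Shell n contains orbital types l = 0 .. n-1 (s, p, d, f, ...), with
# 2l + 1 orbitals of each type and two electrons per orbital.

def shell_capacity(n):
    """Maximum number of electrons in shell n."""
    orbitals = sum(2 * l + 1 for l in range(n))
    return 2 * orbitals

capacities = [shell_capacity(n) for n in range(1, 5)]
assert capacities == [2, 8, 18, 32]

# Equivalently, shell n holds 2 * n**2 electrons.
assert all(shell_capacity(n) == 2 * n ** 2 for n in range(1, 8))
```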
It is the shells that define the table’s rows. In the first row, which consists of hydrogen (one electron) and helium (two), the first shell is filled up. In the second row, from lithium to neon, the second shell is filled. The third row, from sodium to argon, fills the s and p orbitals of the third shell. The fourth, from potassium to krypton, fills the s and p orbitals of the fourth shell and the d of the third shell (which has ten electrons altogether, for the ten columns of transition metals).
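The row lengths themselves fall out of the order in which orbitals fill, which follows the Madelung rule: fill by increasing n + l, ties broken by smaller n. A sketch assuming that rule, and stopping at the 118 elements of the completed seventh row:

```python
# Orbitals fill in order of increasing n + l, ties broken by smaller n
# (the Madelung rule). Each fresh s orbital opens a new row, which
# reproduces the periodic table's row lengths.

def madelung_order(max_n):
    orbitals = [(n, l) for n in range(1, max_n + 1) for l in range(n)]
    return sorted(orbitals, key=lambda nl: (nl[0] + nl[1], nl[0]))

def row_lengths(max_electrons=118):
    rows, current, total = [], 0, 0
    for n, l in madelung_order(7):
        if total >= max_electrons:
            break
        if l == 0 and current:  # a new s orbital starts a new row
            rows.append(current)
            current = 0
        electrons = 2 * (2 * l + 1)  # capacity of this orbital type
        current += electrons
        total += electrons
    rows.append(current)
    return rows

# Rows of 2, 8, 8, 18, 18, 32 and 32: hydrogen and helium at the top,
# down to the completed seventh row ending at oganesson (118).
assert row_lengths() == [2, 8, 8, 18, 18, 32, 32]
```

The rows of 18 gain their extra ten slots from the d orbitals (the transition metals), and the rows of 32 a further fourteen from the f orbitals (the lanthanides and actinides).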
Compounds are created either by unpaired electrons from different atoms forming joint orbitals called covalent bonds, or by the complete transfer of unpaired electrons between atoms, to create paired orbitals in the recipients. When this happens, the resulting positive and negative ions are held together by electrostatic forces—a process called ionic bonding. The repetitive order in which the shells are filled in each row means that elements in each column of the table have the same combination of unpaired electrons, and thus similar properties. For example, the noble gases are inert because they have no unpaired electrons. Further analysis showed, moreover, that the difference between metals and non-metals depends on how easy an atom’s outer electrons are to detach (if easily detached, they can flow as an electric current, reflect light in the way that makes metals shiny, and confer ductility on the solid form of the element). And that, essentially, is chemistry solved.
It is not quite, however, the end of the story. In the 1930s physicists discovered that radioactivity could, in essence, be reversed by bombarding atoms with subatomic particles to increase their atomic numbers. This way, new elements can be produced. Technetium, created in 1937, was the first such. Two years later francium, the last to be discovered in nature, was isolated as a decay product of actinium. From that moment the extension of the periodic table became work for physicists, not chemists.
Technetium is strange. Despite its low atomic number (43) it has no stable isotopes, and is thus found only transiently in nature. This is a quirk of the physics of protons and neutrons that it shares with promethium (61). But at the heavy end of the table, beyond lead (82), radioactivity is compulsory for all. And beyond uranium (92) it is so compulsory that “transuranics” were once thought not to occur in nature.
This part of the periodic table was the playground of Glenn Seaborg, an American physicist. In 1940 Seaborg was part of a group at the University of California, Berkeley, that made neptunium (93). When the group’s head left later that year, Seaborg took over. On his watch americium (95), curium (96), berkelium (97), californium (98), einsteinium (99), fermium (100), mendelevium (101) and nobelium (102) were all created. But his first discovery, plutonium (94, in 1941), was the most important. On July 16th 1945, the first atom bomb, a plutonium-implosion device, was tested at Alamogordo, New Mexico. On August 9th of that year another of the same design destroyed Nagasaki, in Japan.
Americium has its uses, too. Since it was a synthetic product, it was patentable, and Seaborg did, indeed, patent it. It was (and is) employed in smoke detectors, and he drew a tidy income from that fact for many years. Beyond 95, though, the practical point of extending the table became less and less obvious as elements became less and less stable.
Efforts to make new elements slowed after 1955, though the pace picked up again in the mid-1990s. Neither chemistry nor the wider world, however, reverberated with excitement at the creation of darmstadtium (110), roentgenium (111), copernicium (112) and nihonium (113) in the way that they had with the discovery of potassium, or helium, or radium or plutonium. What started as stamp collecting has returned to its roots—except in one regard. This is that, thanks to Mendeleev’s brilliance, element-hunters now have an album in which to stick their discoveries.
The heaviest element of all, oganesson (118), was created in 2002, though named only in 2016. Oganesson completes the table’s seventh row. Chemically, it should be a noble gas. But, with only a few atoms of it to play with at a time, and with those atoms having lifetimes measured in milliseconds, it seems improbable anyone will ever know for sure.
Despite physicists’ best efforts, then, the eighth row has not been reached. But as Mendeleev himself said, “To conceive, understand and grasp the whole symmetry of the scientific edifice, including its unfinished portions, is equivalent to tasting that enjoyment only conveyed by the highest forms of beauty and truth.” For those who share this view, and see in the periodic table a supreme example of nature’s poetry, the row-completing, album-filling addition of oganesson may seem as good a place as any to stop.
A safety director issued a public warning that buckets of ore may have exposed employees and tourists to unsafe radiation. But being near natural uranium ore is unlikely to cause an unsafe dose.
HN Discussion: https://news.ycombinator.com/item?id=19203911
Posted by brundolf (karma: 3798)
Post stats: Points: 126 - Comments: 71 - 2019-02-19T23:11:13Z
#HackerNews #around #buckets #canyon #for #grand #had #museum #sitting #uranium #years
The safety director at Grand Canyon National Park says people may have been exposed to radiation from three buckets of uranium ore that sat for years in a museum collection building. Whether the amount of exposure was unsafe has not been determined.
For many years, three buckets full of uranium ore sat in a museum building at Grand Canyon National Park. Tours often visited the museum collection building, with children on tours sitting next to the buckets for a half-hour.
Recently, the parkʼs safety, health and wellness manager, Elston "Swede" Stephenson, sent out an email to National Park Service employees and approached the Arizona Republic to warn that people in the building were "exposed" to radiation.
Whether that proximity was unsafe has not been determined. Simply being near uranium ore is unlikely to result in an unsafe dose of radiation.
"Uranium can be harmful to peopleʼs health depending on the amount and grade of ore, how people interact with it and the exposure time," Jani Ingram, a professor of chemistry and biochemistry at Northern Arizona University, tells the Associated Press.
But, she says, "You can't say, 'Oh my gosh, all those kids are going to develop cancer in five years' because you just don't know how close they were, how long they were there. But that open bucket was probably the most concerning. It seemed that maybe whoever it was didn't understand what they had."
Stephenson sent an email to employees on Feb. 4, warning that if they had gone in the museum collections building between 2000 and June 18, 2018, "you were ʼexposedʼ to uranium by OSHAʼs definition."
How was the uranium discovered after all that time? In March 2018, the teenage son of a park service employee had a Geiger counter that detected radiation in the collection room, Stephenson said. The buckets had apparently been in a basement for decades before being moved to the museum.
Stephenson told the Republic that he immediately contacted a park service specialist to report the uranium ore. A few days later, technicians arrived.
Photos provided to the newspaper by Stephenson show technicians arriving in June 2018 to take away the buckets of uranium ore. The technicians reportedly dumped the buckets at an old uranium mine 2 miles away, then for some reason brought the buckets back to the building.
Stephenson said the park didnʼt do anything to warn workers or tourists that they had perhaps been exposed to unsafe levels of radiation, despite a Right to Know law that he said requires disclosing the incident.
"My first interest is the safety of the workers and the people," he told the Republic. He is especially concerned about kids who were potentially exposed to radiation, at levels he calculated to be 1,400 times the Nuclear Regulatory Commissionʼs safe level for children.
Dennis Wagner, the Republic reporter who broke the story, said that Stephenson approached the newspaper to get the word out to the public after his efforts to get the park service to warn the public went nowhere.
Grand Canyon National Park Public Affairs Officer Emily Davis said that the park service is coordinating an investigation with the Occupational Safety and Health Administration.
"A recent survey of the Grand Canyon National Parkʼs museum collection facility found radiation levels at background levels — the amount always present in the environment — and below levels of concern for public health and safety," Davis said in a statement to NPR. "There is no current risk to the public or park employees. The museum collection facility is open and work routines have continued as normal. The NPS takes public and employee safety and the response to allegations seriously. We will share additional information about this matter as the investigation continues."
OSHA confirms to NPR that it has opened an investigation into the matter.
This isnʼt Stephensonʼs first time raising alarms about a dangerous working environment, the Republic reports:
"Stephenson, a military veteran who is certified as an occupational safety and health technician, was in a similar controversy during his time in the Navy. According to court records, he began calling for action to prevent falls after a series of accidents in 2016.

"As complaints escalated, Stephenson was fired. He turned to the Office of Special Counsel, a federal agency that protects whistleblowers, and his termination was stayed. It is unclear how that case was resolved, but within months, Stephenson had a new job with the National Park Service.

"Stephenson said the uranium exposure saga developed while he was pursuing a racial-discrimination complaint with the Equal Employment Opportunity office. Stephenson is African-American."
- #images #photos #foto #photo #photography #photograph #photographs #newsphotography #worldnews
The acclaimed Montreal troupe are on tour to celebrate a quarter-century of astonishing shows, performed for more than three million people around the world. See how their motley crew of acrobats, clowns, jugglers and musicians have shaped contemporary circus
- #images #photos #foto #photo #photography #photograph #photographs #newsphotography #worldnews
From tossing paper towels to hurricane victims in Puerto Rico to rallies to Kanye, here are some of the images of the first half of Trump’s term of office
- #images #photos #foto #photo #photography #photograph #photographs #newsphotography #worldnews
Australia’s most prestigious sporting event hits a landmark birthday this year after half a century of tennis