Hacker News | 25cf's comments

I hope I'm not putting myself on a list by posting this.

Anyway, the comments lead down a pretty dark rabbit hole about Assange's whereabouts since they cut off his internet access...


>Finally, any even number divided by two is still even and approaches two.

This isn't true: 6 is even, but 6 divided by two is 3, which is odd.
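A two-line check makes the counterexample concrete (a quick C++ sketch, nothing more):

    #include <iostream>

    int main() {
        int n = 6;              // even
        int half = n / 2;       // 3, which is odd, so the claim fails
        std::cout << (half % 2 == 0 ? "even" : "odd") << "\n";  // prints "odd"
    }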


These people are either being intentionally deceitful or are simply incompetent in understanding the full implications of what they're asking. Either possibility is disturbing.



So they know this leads to an Orwellian state and they do not care? Maybe, but it would be political suicide to admit that.

Certainly Comey is being deceitful in his first sentence. Even Hillary called for a Manhattan-like project to circumvent encryption back in December [1]. That shows it's something that's being discussed in Washington quite frequently.

I doubt any will ever admit they're trying to lead us towards an Orwellian state. Also, encryption is part of our protection from that. Heads of state and encryption currently appear to be directly at odds.

[1] http://www.cbsnews.com/news/democratic-debate-transcript-cli...


I think the amount of unnecessary emotional appeals makes it fairly clear that they're being intentionally deceitful.


Any time an American official or politician says the word "terrorist", my mind simply shuts off. I know every subsequent word will be rhetorical nonsense and bluster. That's pretty sad.



The main feature that I miss from CLion is the ability to refactor. In CLion you can rename a class, and all of its usages, declarations, etc. will be renamed.

The most useful thing CLion can do is change the signature of a function, updating its corresponding implementation and header declarations to match. So you can rename a parameter and its declaration and implementation will be renamed for you, or you can delete a parameter, or change the type of a parameter, and CLion will change all its usages/definitions/declarations for you. All of this takes effect across every file in your project, so if you're in a .cpp file and you change a function signature, its declaration in the corresponding .h file will change as well.
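As a made-up illustration (the Widget class below is hypothetical, not from any real project): if you change resize from (int, int) to (double, double), the change-signature refactoring rewrites all three of these places at once.

    #include <iostream>

    struct Widget {
        double width = 0, height = 0;
        void resize(double w, double h);       // header declaration (was int, int)
    };

    void Widget::resize(double w, double h) {  // .cpp definition, updated to match
        width = w;
        height = h;
    }

    int main() {
        Widget button;
        button.resize(120.0, 40.0);            // call site, updated as well
        std::cout << button.width << "x" << button.height << "\n";
    }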

(I believe you can do all of this in Eclipse as well)

Sadly I have yet to see a vim/emacs configuration that gets anywhere near the level of Eclipse/CLion.


>and all of its usages, declarations, etc. will be renamed.

Along with header include statements in any file that referenced the class. Say I refactor class Foo in include/mylib/foo.hpp, and it's included in src/mylib/foo.cpp as <mylib/foo.hpp>: CLion happily "refactors" that to #include "foo.hpp". I spent as much time fixing this stuff as I would have spent manually renaming the usages...
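Concretely, with the paths above, the rewrite looks something like this:

    // src/mylib/foo.cpp, before the rename:
    #include <mylib/foo.hpp>   // resolved via the include/ search path

    // src/mylib/foo.cpp, after CLion's "refactor":
    #include "foo.hpp"         // only resolves if foo.hpp sits next to foo.cpp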


As a new IDE user, I was very excited about the language-aware "rename" refactoring ability. But I realized that the variables I rename are mostly badly chosen names in IDE-generated boilerplate, e.g. the names of widgets inserted into forms. When I'm writing new code in vim I just choose good names to begin with, so it's a bit of a wash for me.

I hope there is room in my brain to stay good at vim despite using VS at work.


Not a biologist by any stretch of the imagination, so I am asking purely out of ignorance: how does your study debunk the OP's? Yours is about NSAIDs (e.g. aspirin) decreasing the risk of Alzheimer's; OP's is about paracetamol (not an NSAID) increasing the likelihood of Alzheimer's. Correct me if I'm misunderstanding something here...


It found aspirin most likely has a protective effect.

It found no strong evidence of any association for acetaminophen, but the 95% confidence interval on the result is pretty wide. So it doesn't entirely rule out the possibility of there being a relationship, and based on its results alone it's entirely plausible that a test with more statistical power could find that it doubles your risk of Alzheimer's disease. But it could also plausibly find that it has a mild protective effect.

Regardless of which is the case, this study still strongly challenges what's being suggested by TFA. Even a doubling of Alzheimer's risk resulting from a lifetime of acetaminophen use is nothing at all like its suggestion that Alzheimer's is a disease that was unheard of before the development of NSAIDs related to Tylenol.


> its suggestion that Alzheimer's is a disease that was unheard of before the development of NSAIDs related to Tylenol.

Minor point: Tylenol isn't an NSAID, and neither is the earlier drug that the article links to Alzheimer's.


GP's study includes paracetamol (also known as acetaminophen) as well as NSAIDs, and finds no effect, so it is a contrary result. Calling it a preemptive debunking may be a bit excessive, but given that it is a more direct study of the relationship, it (from the abstract, at least) seems to be a reason for skepticism.


I think perhaps you didn't read it carefully enough:

"In addition, we examined use of acetaminophen...No association was found between AD risk and use of acetaminophen"


"Not a biologist by any stretch of the imagination"- a problem with the software engineers on Hackernews pontificating about other subjects.


This is quasi-ad hominem, and yet there's a kernel of truth to it, i.e. being a software engineer doesn't necessarily make you a critical thinker.


"I am asking purely out of ignorance ... Correct me if I'm misunderstanding something here"

How do you think 25cf should have handled this differently?


Just ask the question. The preamble and last sentence only detract from the main point: how does your study debunk the OP's? Yours is about NSAIDs (e.g. aspirin) decreasing the risk of Alzheimer's; OP's is about paracetamol (not an NSAID) increasing the likelihood of Alzheimer's.

If you're misunderstanding something, don't worry, someone will set you straight.


I think this is going in the wrong direction. Tactile feedback is so crucial for keyboards and this just completely removes it. In my opinion, technologies like the morphing touchscreen keyboard[1] are the future.

[1] http://tactustechnology.com/


Neither. Truth is, we are a generation that grew up using keyboards. The one coming after us will consider it normal to use an onscreen keyboard or voice recognition. This is a problem that will solve itself.


No. Humans depend on feedback loops to coordinate their fine motor movements; without that feedback, absolute positioning accuracy is poor. Even an on-screen keyboard provides such feedback. Typing in the air doesn't.


So type on the table?

Seriously, wouldn't that be the same amount of feedback as a touchscreen keyboard? Flat, and you get the wrong letter when you miss?


No, a table gives none of the feedback he's referring to. When you hit a key on a keyboard, the key sinks into the board; you get a positive 'strike'. If you miss the key slightly, it feels different and you'll know before the wrong letter appears - this is what is meant by a feedback loop.


That's off-topic. We're talking about touchscreen keyboards, as is very clear in my post, and we won't pretend their keys sink.


With an on-screen keyboard, you get visual feedback from seeing your fingers land relative to the keys, even if you don't get tactile feedback. With this, all you see is whether you got the letter right.


I don't understand where you and I disagree. That "no" seems confusing.


> This is a problem that will solve itself.

> Even an on-screen keyboard provides such feedback. Typing in the air doesn't.

Probably there.


Voice is a separate issue, but this has the same problem as a touchscreen. We are very sensitive to touch. It's quite possible that with the proliferation of touchscreens, children will find less and less tactile feedback in the world around them (compounded by other factors: less play time, sterilized playgrounds, etc.), and we could end up with a generation of adults lacking the tactile ability we have today.

So I don't think the problem solves itself, it just compounds.

Much like our loss of movement ability, it seems to become less relevant, until you notice that we've lost an essential part of what makes us human.

It's about time computers started fitting humans and not the other way round.


> It's about time computers started fitting humans and not the other way round.

This is a really interesting argument and has got me thinking. I have to say that I agree with your sentiment. The problem is that I think we've entered into a kind of feedback loop: a compounding problem, as you described it.

Technology and intelligence have allowed us to rapidly accelerate our fitness while simultaneously and subtly forcing us to become increasingly dependent upon them for continued advancement. In short, tech informs the ways in which we progress and, in the interest of further progress, our new (less human?) norms inform the progression of tech.

I think there are examples of this all over the place: infants expressing confusion at a screen that is not touch sensitive, memes and texting idioms seeping into spoken conversation, or people asking their device to call someone and intentionally mispronouncing the name the same way the device does so that it "understands."

But I also disagree that tech can start fitting humans in the sense you describe, because I simply don't think that's possible anymore.

There's a post on the front page right now discussing self-experimentation with black widow bites, a comment from which [0] strikes me as relevant to the argument I'm about to make. To quote, "[humans]...are pretty weak creatures all things considered." A lot of cutting edge tech today is focused on AR and mobility. That's because, as the quoted comment notes, humans are weak; a weakness which also extends to our senses. Touch seems so fundamentally human, yet a computerized sensor can tell us so much more than a fingertip about a surface. It sounds terrible when I write it, but there it is: information is powerful (addicting?).

It's impossible for us to conceive the kinds of situations humanity will face in the future, but after a certain point the limitations of the human body will have to be addressed. Right now that means computers.

So, I suppose what I'm saying is that, imho, computers actually are fitting "humans," it's just that they are fitting what we have, will and must continue to become rather than what we once were. And that's probably a good bad thing (rather than bad bad) in the long run.

[0] https://news.ycombinator.com/item?id=7860997


Computers are fitting humans; that's why they are now a thin slate you can easily carry around.

And there is feedback when you touch a glass surface; it just doesn't give you the non-visual cue that you've pressed a letter, the cue you've grown accustomed to.

I don't think that's a dealbreaker for the generation that is growing up with tablets. I certainly don't see my teenage cousins or nephews/nieces worried at all about using keyboards. If they have them around they might use them, but they type fast on the screens as well. Most of them no longer have a laptop or desktop anyway. I've seen them write whole essays on the phone while lying on the bed.

Talking about lost feedback, think about what a huge step back it was for our grandfathers' generation to abandon handwriting and switch to pressing keys on a typewriter. I'm pretty sure we could have argued back then about how little feedback you get from a key vs. the flow of the hand and the friction of the paper when you write. And somehow we adjusted so well that most of us are probably incapable of writing a handwritten letter nowadays. We are not "less human" because of this.

We will survive.

And note I'm not trying to say that this is the new status quo and we should all accept it and move along. I'm sure we will see a lot of different approaches to solving the issue in the future. What I'm saying is that the pool of users these solutions speak to is diminishing. We see these ideas and think that they'll appeal to everybody who uses a tablet. Most likely they'll only be used by a niche market. Nothing wrong with that, but definitely something to take into account when trying to launch a company around them.


What about productivity? A professional programmer, blogger, etc. needs to create huge amounts of content.

Touch-screen devices are good for content consumption but absolutely unfit for content creation compared to a computer.

People are creating high-quality keyboards[1] because there is an actual market :-)

[1] http://codekeyboards.com/


That's where machine learning plays a role. It learns what movement corresponds to a particular letter. If that movement changes over time, it learns that too.
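A minimal sketch of that idea (all names here are hypothetical, not any product's actual code): keep one learned centroid per letter, classify each tap by nearest centroid, and nudge the matched centroid toward where the finger actually landed, so the model tracks drift over time.

    #include <array>

    struct Point { double x = 0, y = 0; };

    struct AdaptiveKeyModel {
        static constexpr int kKeys = 26;
        std::array<Point, kKeys> centroids;  // one learned position per letter
        double alpha = 0.1;                  // learning rate: how fast it adapts

        // Classify a tap as the letter whose centroid is closest.
        int classify(Point p) const {
            int best = 0;
            double bestDist = 1e300;
            for (int k = 0; k < kKeys; ++k) {
                double dx = p.x - centroids[k].x, dy = p.y - centroids[k].y;
                double d = dx * dx + dy * dy;
                if (d < bestDist) { bestDist = d; best = k; }
            }
            return best;
        }

        // Called once a keystroke is confirmed (or corrected): move the
        // centroid a little toward where the finger actually landed.
        void update(int key, Point p) {
            centroids[key].x += alpha * (p.x - centroids[key].x);
            centroids[key].y += alpha * (p.y - centroids[key].y);
        }
    };

    int main() {
        AdaptiveKeyModel model;
        model.centroids[0] = {0.0, 0.0};     // seed 'a'
        model.centroids[1] = {10.0, 0.0};    // seed 'b'
        Point tap{1.5, 0.2};                 // a slightly drifted 'a'
        int key = model.classify(tap);       // -> 0 ('a')
        model.update(key, tap);              // 'a' centroid shifts toward the tap
    }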


I wonder if these would actually be a feasible mode of transport if people started using ski poles with them to maintain balance. I'd personally like to test that idea out, but I don't have an AirWheel.


> accepting things as they are

really? as opposed to innovating?


Are you attempting to do a Cliff's notes of my comment?


I don't find it hurtful.

