Hacker News | new | past | comments | ask | show | jobs | submit | ajax77's comments

This simply isn't true. A number of years ago, with the introduction of DirectX 10, Autodesk began transitioning several mainstay products to exclusive DirectX use under Windows (see http://archicad-talk.graphisoft.com/files/autodesk_inventor_...). Moreover, Autodesk has moved to using a modern, unified renderer for a large number of their products. There may be other CAD companies "holding up" the "monstrosity" of dated OpenGL versions, but Autodesk is not that company. In all likelihood, the issue lies not with the CAD companies, but with their clients that refuse to upgrade legacy applications that are 10-15+ years old.


Ok, so my information must be outdated; apologies, I read about this a few years ago.

OGL still needs to get rid of the legacy fixed-function pipeline and standardize the interfaces for more advanced functionality (like OGL ES 2 and DX10).


Ever heard of the Core Profile?

If you request an OpenGL context from your OS and ask for a core profile context, you'll get a context that's almost identical to ES2. It's shader based and all of the legacy fixed stuff is entirely gone.


Yeah, no. Kinect uses an IR emitter with an IR-based depth camera, there's nothing stereo about it. A regular RGB camera adds color, but that is all.


Not to nitpick, but there is a stereo aspect to it. The IR projector is offset laterally from the IR camera, and this is critical to evaluating the depth, because the disparity of the IR dots is more extreme the greater the distance between the emitter and the camera.


It is actually exactly stereo, except one of the cameras is replaced with an IR projector; you basically do the standard stereo math as though the projector is a camera 'seeing' the image that is being projected.


Pedantry isn't so bad, if it's correct... As was stated, the term "source codes" is very common in many communities, particularly scientific computing and associated academic circles. The only credibility lost here is... well let's just move on here, shall we?


That's simply not the case. In general, the "no penalty" holds true for primitive data types, and when passing an r-value instance of a class that has a move constructor defined. In the case where your object is an expensively constructed l-value, it is preferable to pass by const reference rather than by value (assuming you plan to use the object later... otherwise you could indeed pass it with std::move).
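
A tiny illustration of the distinction, using a copy/move-counting type (the type and the counters are mine, purely for demonstration):

```cpp
#include <utility>
#include <vector>

// Hypothetical type that counts copies and moves, to show what each
// passing convention actually costs.
struct Tracked {
    static int copies, moves;
    std::vector<int> data;
    Tracked() : data(1000) {}
    Tracked(const Tracked& o) : data(o.data) { ++copies; }
    Tracked(Tracked&& o) : data(std::move(o.data)) { ++moves; }
};
int Tracked::copies = 0;
int Tracked::moves = 0;

void byValue(Tracked t) {}           // copies an l-value argument, moves an r-value
void byConstRef(const Tracked&) {}   // never copies or moves
```

Calling byValue with an l-value costs a full copy of the vector; byConstRef costs nothing, and byValue(std::move(x)) costs only a cheap move.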


In C++11 you are supposed to implement move semantics in your own classes. I'm not 100% sure, but I think the standard requires that the STL use move semantics.
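
For instance, opting one of your own classes into move semantics can be as simple as defaulting the move members (a sketch; the class itself is made up):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical example class: defaulting the move members lets the
// vector's buffer change hands instead of being copied.
class Buffer {
    std::vector<double> samples_;
public:
    explicit Buffer(std::size_t n) : samples_(n) {}
    Buffer(Buffer&&) = default;              // steal the buffer
    Buffer& operator=(Buffer&&) = default;
    Buffer(const Buffer&) = default;         // copying stays available
    Buffer& operator=(const Buffer&) = default;
    std::size_t size() const { return samples_.size(); }
};
```

And the standard library does take advantage of moves where it can; e.g., std::vector moves elements on reallocation when their move constructors are noexcept.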


Yes but this is one more thing you have to think about when implementing a class in C++ and one more chance to screw it up.

It makes vanilla C and raw pointers look a lot more appealing, to be honest.


Actually I think C++11 is a simpler language than C++03 or C++98. The problem is that we need to un-learn what we know and learn the new way of doing things.


C++11 is simpler for the casual user of classes, but I'm not so sure it's easier if you're implementing a lot of classes yourself. I disable assignment and copying in all my classes by default unless I have a really good reason to implement them, and I still pass by const & for types I define myself.


With the niceties of C++11, it's certainly become a lot less painful; lambdas, auto (both variable and return type), variadic templates and r-value references give the language just enough flexibility to allow functional style with far more natural syntax. To be sure, it's not a perfect setup; type traits (e.g., enable_if) in function signatures are ugly and hardly a replacement for concepts.
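
For reference, the sort of enable_if-in-the-signature clutter being referred to looks roughly like this (a made-up function, purely illustrative):

```cpp
#include <type_traits>

// Hypothetical: restrict half() to floating-point T by wedging
// enable_if into the return type -- workable, but hardly pretty.
template <typename T>
typename std::enable_if<std::is_floating_point<T>::value, T>::type
half(T x) {
    return x / 2;
}
```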

I've been dabbling in the creation of a functional library that provides a modest subset of Prelude-inspired features from Haskell: https://github.com/jdduke/fpcpp. Many of the algorithms are simply wrappers around STL algorithms, hiding some of the syntactic awkwardness attendant on the heavy use of iterators. Some day I'll get around to playing with monadic devices, and integrating the list comprehension work that I've done recently.


A slight variation with neither error codes nor exceptions.

  template< typename T >
  auto findRoots(T a, T b, T c) -> pair<maybe<complex<T>>, maybe<complex<T>>> {
    typedef complex<T>       C;
    typedef maybe<C>         Root;
    typedef pair<Root, Root> Roots;

    if (a == 0) {
      if (b == 0) 
        return Roots(nothing,      nothing);
      else
        return Roots(Root(-c / b), nothing);
    } else {
      auto d     = C(b*b - a*c*4);
      auto two_a = a*2;
      return Roots(Root((-b + sqrt(d)) / two_a), 
                   Root((-b - sqrt(d)) / two_a));
    }
  }
  
maybe is simply a templatized std::pair wrapper that lets you check if the returned value is valid (either via maybe.valid() or if(maybe)). All imaginary results are handled automatically via std::complex. nothing decomposes into an invalid maybe value of the appropriate type.
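
A minimal sketch of what such a maybe might look like (my own reconstruction from the description above, not the actual fpcpp code):

```cpp
#include <utility>

// An empty tag type; `nothing` converts to an invalid maybe of any T.
struct nothing_t {};
const nothing_t nothing = {};

template <typename T>
class maybe {
    std::pair<T, bool> rep_;   // (value, valid flag)
public:
    maybe(nothing_t) : rep_(T(), false) {}
    maybe(const T& v) : rep_(v, true) {}
    bool valid() const { return rep_.second; }
    explicit operator bool() const { return valid(); }  // allows if (m)
    const T& get() const { return rep_.first; }
};
```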


The minor nit first: the original specification did not support complex numbers, so adding them here is even less relevant than my return codes.

More importantly, your implementation is numerically less correct than my version, which itself is less correct than it should be.

Consider when b is near sqrt(d). In that case, b-sqrt(d) can lose precision. That's why I used the copysign function, so that I'm always adding two numbers of equal sign and size.

Mine is incorrect if b^2 is near the size of 4ac, since there too b^2 - 4ac loses precision. Ideally this intermediate result should be done in quad precision if the input is in double.
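
For the real-root case, the copysign trick reads roughly like this (my paraphrase of the approach being described, not the original article's code):

```cpp
#include <cmath>
#include <utility>

// Stable real roots of a*x^2 + b*x + c, assuming a != 0 and b*b - 4ac >= 0.
// copysign keeps b and the sqrt term the same sign, so they never cancel.
std::pair<double, double> stableRoots(double a, double b, double c) {
    double s = std::sqrt(b*b - 4.0*a*c);
    double q = -0.5 * (b + std::copysign(s, b));  // same-sign addition
    return std::make_pair(q / a, c / q);          // the two roots
}
```

Note c / q for the second root: dividing sidesteps the subtraction that loses precision.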

Question: How does C++ handle copysign() (vs. copysignf() for floats), and support templates which want to use a higher precision intermediate value?


My reply wasn't meant as a numerically superior solution, simply as a fairly simple alternative without error codes...

As for "original specification", I'm not sure why you even bring that up. complex is certainly in the C++11 specification, against which my snippet is compliant.


Oh, I see I didn't explain well enough. I'm curious about how one would write the "numerically superior solution." That is, does C++11 have a templated copysign function which takes different numeric types (at least float, double, and quad)? If not, then there's some unneeded type promotion (or downcast) going on.

And if the template uses a float, how do I get the type with double precision to use as my intermediate? (And the same where the template uses a double and I want the intermediate to use a quad.)

There must surely be a way to handle these, but I haven't done C++ programming for over a decade and I don't know the modern way of doing things.
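
To sketch an answer (caveat: the trait name below is my own invention; only the <cmath> overloads are standard): C++11's <cmath> overloads std::copysign for float, double and long double, so a template calling std::copysign(x, y) stays in T's precision; and a widened intermediate can be picked with a small trait, though the standard library tops out at long double, with no quad type:

```cpp
#include <cmath>
#include <type_traits>

// Hypothetical trait mapping each type to a wider intermediate.
template <typename T> struct higher_precision;  // undefined primary
template <> struct higher_precision<float>  { typedef double      type; };
template <> struct higher_precision<double> { typedef long double type; };

// Compute b + copysign(s, b) in the wider type, then narrow back.
template <typename T>
T widened_sum(T b, T s) {
    typedef typename higher_precision<T>::type W;
    W r = static_cast<W>(b) +
          std::copysign(static_cast<W>(s), static_cast<W>(b));
    return static_cast<T>(r);
}
```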

As regards "original specification" - I mean the original article from feabhas.com, which has since disappeared. As I recall, it only supported real roots, and not imaginary ones.

BTW, _Complex is also in C99.


It should be noted that it's very clear his executable sizes are for a Debug compilation. I tossed the code for the final example into VC10, and sure enough, in Release mode the compiled size is 10KB, while in Debug it's just under the size he stated. This is for 64-bit.

I'm not sure what point it serves to compare debug mode executable sizes, but that should be clearly stated in the article. As other people stated, the final example really ought to lose the wrapping class in the spirit of C++11 brevity and functional decomposition.


Whether or not you are a C++ fan, templatized algorithms ARE often faster than their C-based, generic counterparts. Any decent C++ compiler should be able to inline generic code while C often relies on function pointers. I'm not saying this is always the case, but often. std::sort beating qsort is a typical argument, though I'm not sure the two algorithms are equivalent excepting language-specific constructs (http://stackoverflow.com/questions/4708105/performance-of-qs...)
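
The contrast in a nutshell (standard calls only; the comparison function is illustrative):

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// qsort's comparator: reached through a function pointer on every compare.
int cmp_int(const void* a, const void* b) {
    int x = *static_cast<const int*>(a);
    int y = *static_cast<const int*>(b);
    return (x > y) - (x < y);
}

void sort_both(std::vector<int>& v1, std::vector<int>& v2) {
    // std::sort: the comparator is part of the instantiated type,
    // so the compiler can inline it directly into the sort loop.
    std::sort(v1.begin(), v1.end(), [](int a, int b) { return a < b; });
    std::qsort(v2.data(), v2.size(), sizeof(int), cmp_int);
}
```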

As for size, you're absolutely right, though for many scenarios that appears to be less and less a pressing concern. Perhaps the biggest drawback is compilation time.


> Any decent C++ compiler should be able to inline generic code while C often relies on function pointers.

What makes you think that a decent C compiler can't inline runtime-generic code using void* and function pointers?

Sure, erasing types only so that the optimizer has to figure the information out again using constant propagation is far from optimal, but the underlying issue is actually source-code inclusion vs. modular compilation -- most of the speed gain of templates comes from the fact that they only work with the former, whereas C code traditionally uses the latter.


I agree with what you're saying but C can be persuaded to make template like code:

http://attractivechaos.awardspace.com/ksort.h.html


Compilation time is becoming less of an issue as well, with compilers getting faster and faster. For example, on the same codebase, clang 3.0 compiles almost 20% faster than GCC 4.6.

Even so, making heavy use of templates I have not yet found any major issues or that compile time has increased so drastically that it became a concern for the project.


Thankfully you're right; compilers have come a long way and Clang in particular has done a lot for easing the pains of template programming in a number of ways.

I'm knee deep in the development of several header-based, fully templated libraries, so I still certainly feel the pains of compilation times. And while the tests/samples I deal with tend to be somewhat extreme in their (ab)use of templates, I think you'll still find that the average Boost user would give a lot for yet speedier compilation.


std::sort is stable, IIRC. Otherwise, they are similar; and yes, std::sort is typically faster.

Do note that instruction caches are not unlimited, though: bloated code does have a performance cost (sometimes, caches are hard.)


std::stable_sort exists and is the stable one.


I can sympathize; I too use C++ by day, and have become increasingly bitter/jealous/frustrated as I've spent more time with Haskell by night.

To the point that I've started writing several libraries that bring to C++ several of the features I miss most from Haskell. In particular, I've missed the natural syntax of higher-order functions, e.g., currying and functional composition, as well as the pithy expressiveness of the Prelude vs. the STL. With C++11, we can have these things, and while Boost gives us a wealth of functionality, it's hardly the most natural (or lightweight) library.

fc (functional composition) (https://github.com/jdduke/fc) and fpcpp (functional programming with C++)(https://github.com/jdduke/fpcpp) are my recent (very much work-in-progress) efforts towards this end.

With fc, one can naturally compose functions, lambdas and function objects, e.g., given composable functions f, g, h, write auto fgh = f + g + h; or, auto fgh = compose(f, compose (h,g));, and then call it as fgh(a,b,c);
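
A rough sketch of what a two-argument compose might look like in C++11 (my illustration; fc's actual implementation differs):

```cpp
#include <utility>

// compose(f, g) yields a callable applying g first, then f.
template <typename F, typename G>
struct composed {
    F f; G g;
    template <typename... Args>
    auto operator()(Args&&... args)
        -> decltype(f(g(std::forward<Args>(args)...))) {
        return f(g(std::forward<Args>(args)...));
    }
};

template <typename F, typename G>
composed<F, G> compose(F f, G g) {
    return composed<F, G>{f, g};
}
```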

fpcpp enables syntax like

  let pi = [](unsigned samples) -> double {
      typedef std::pair<double,double> point;
      let dxs = map([](const point& p) { return p.first*p.first + p.second*p.second; },
                    zip(take(samples, rand_range(-1.0,1.0)),
                        take(samples, rand_range(-1.0,1.0))));
      return 4.0 * filter([](double d) { return d <= 1.0; }, dxs).size() / dxs.size();
  };
  EXPECT_NEAR(pi(10000), 3.14, 0.05);
In writing fc and fpcpp, I've actually become less and less concerned about the future of C++; having used a good number of the C++11 features available, I've regained a good deal of my passion for C++ programming. Really, C++ needs a more expansive and natural standard library; iterators are powerful, but they are NOT the answer.


You highly doubt a revolutionary figure said a radical thing because it would have been considered impolite/revolutionary/radical?

Newton was in a world of his own; he, along with a few select others (Locke, Descartes among them), was in large part responsible for the Enlightenment. The very Enlightenment this statement characterizes so well. Even if he didn't say it (I cannot find the source, but I have read this quote a number of times in various papers), it is very much the kind of statement he would make.


No, it isn't, for the same reason Newton didn't wear his hair like Elvis. No one invents his own culture. This has nothing to do with how radical Newton's ideas were or how great a genius he was. It has to do with what cultural forms existed at the time. You have to get to figures like Byron before this kind of statement makes sense (and my bet is that this one comes from later even than that).

However, I'm no scholar of the 17th century. Prove me wrong. If Newton said something that sensational, it won't be hard to track down. I didn't search for it like I usually do before making these claims, so your job should be easy.


Hard work as a means to success is hardly a post-Romantic invention. Newton was very much a student of Aristotle, whose teachings align very much with the statement in question.

In that light, would you also doubt these verified Newtonian quotes that suggest similar thinking? "If I have seen further it is only by standing on the shoulders of giants." "Truth is ever to be found in simplicity, and not in the multiplicity and confusion of things." "If I am anything, which I highly doubt, I have made myself so by hard work."


The first quote is proverbial, and Newton certainly said it, but the phrase dates from the 12th century (http://en.wikipedia.org/wiki/Standing_on_the_shoulders_of_gi...) and is not part of Enlightenment tradition. Quite the opposite, in fact: it comes from the medieval tradition of abasing yourself before the ancients. So no, it doesn't resemble the disputed quote at all.

The second is not familiar to me, but it is 17th century language and does sound like something Newton would say. But it's irrelevant here. He's talking about nature, not himself.

The third is much closer to the disputed quote. But I don't believe Newton said this either. You know who said things like that? Horatio Alger. So let's see a textual source in Newton's works before accepting it as evidence.

Here's a helpful trick. When you Google a quote and the first page consists entirely of junk like this:

http://www.google.com/#sclient=psy-ab&hl=en&source=h...

... that's a sign that the quote is bogus.


I looked for the "sign that the quote is bogus" but missed it. Maybe results that have 5 million hits are bogus? Maybe results that have relevant links are bogus? I can enter all sorts of valid quotes from both current and historic figures into Google, and obtain very similar results. ("Ask Not What Your Country Can Do For You", "A man should look for what is, and not for what he thinks should be", "A man who dares to waste one hour of time has not discovered the value of life.") I fail to see how that adds anything to the discussion. Neither do I see how the fact that Newton restated, rather than originated, a particular idea, helps your argument.

Also, please inform me as to how this statement relates to Lord Byron. Nothing I've read of Byron would favor him as the statement's originator over Newton. I can't help but think of this scene from Good Will Hunting when I read your replies http://www.youtube.com/watch?v=ymsHLkB8u3s&t=1m56s


Hey, this isn't a competition. We're just talking about an interesting historical question.

Maybe one point deserves clarification.

When you Google an authentic quote by a very famous person, a precise textual citation is usually locatable through one of the top results (or something it links to). Therefore, when you Google a quote by a very famous person and nothing but quotespam sites come up, the quote is probably bogus.

I've never seen an authentic quote that fails this test. If anyone can find one -- that is, find a quote by a very famous person, the first page of Google results for which is all quotespam, but which nevertheless is an authentic quote as proven by a real textual reference -- I would like to see it.


Searching books.google.com for the quote is very useful too.


Indeed it is, especially with the inauthor tag, as in:

  inauthor:"Isaac Newton"
Google Books is an amazing resource. Would there were a way to get the full text of everything.


> If I have seen further it is only by standing on the shoulders of giants.

Isn't this quote Newton being typical Newton and using this well-known phrase to be particularly nasty to Hooke? (Hooke was a hunchback, so Newton is saying he did not get insight from Hooke.)

I have read a few biographies of Newton and in some Hooke is a colleague albeit competitor and in others is a much-disliked rival. I don't really know which to believe.


My understanding is that some people think that Newton meant it as a nasty swipe and others disagree. But if you've read a few biographies of Newton then you know a lot more about this than I do.

Edit: there's a hilarious story about Freud that revolves around the quote. Freud was angry that one of his acolytes - I think it was Wilhelm Stekel - had published a book in which he had presumed to modify one of the master's ideas. Stekel defended himself by saying: A dwarf standing on the head of a giant sees a little further than the giant. Freud replied: A louse on my head sees no further than I do.


Newton was a strange mix of the modern and the ancient. Remember that he devoted most of his working life to compiling a chronology of the Old Testament and to summarising various alchemical works. See Westfall's biography Never at Rest.

