- From: Matteo Collina <matteo.collina@gmail.com>
- Date: Fri, 11 Oct 2013 12:06:23 +0200
- To: public-rdfjs@w3.org
- Message-ID: <CAANuz54HdTGDjT94k6Dt3dNzejDuUgD146q=k6wqibAjLvqUpw@mail.gmail.com>
Hi Austin,

2013/10/11 Austin William Wright <aaa@bzfx.net>

> I haven't been impressed with the Node.js ecosystem. Not that there's too
> much choice, but that there's no guarantees of anything.

Even big companies (Microsoft, Oracle, or Google) do not guarantee
anything. You are the ultimate maintainer of your software. You are
responsible for keeping it working. This might be a philosophy of life :).

> Everyone releases 0.x versions of software with no indication of breakage,
> or entire packages simply go out of support, or you end up using two
> different packages that do the exact same thing, or two different versions
> of the same package, or two different instances of the same version of the
> same package (because npm insists on checking out multiple instances of the
> same package). It's a nightmare. (I'd like to move to an RDF-based package
> manager, generic enough for an entire OS, but that's another conversation
> to have.)

You are definitely wrong. Being able to have two instances of the same
library in the same namespace is the best feature of a package manager.
Anybody who has worked extensively with Java (or Windows 95) knows how bad
the situation is when two libraries require different versions of the same
dependency. NPM is smart enough to reuse the same version of a library if
it is fine for all the depending packages. (There is a small sketch of the
resulting node_modules layout further down.)

As for fragmentation, time will help with it: the community is very young
and we are still searching for the best way to solve some problems.

> We should definitely take the time to index functionality of the various
> libraries that exist, after first indexing the common use-cases that these
> libraries will be used towards.
>
> Perhaps after this, there are two things I would suggest:
>
> 1. Publishing specifications, so we can define the functionality we want
> in terms of what would be best for developing. (Libraries are supposed to
> benefit the developers using them, not necessarily the developer writing
> the library.)
>
> 2. Changing and specializing the functionality of the libraries
> accordingly.

I completely agree.

> You write about being asynchronous, but I'm not sure what you're referring
> to exactly. There is no need to have an asynchronous API; as there is
> nothing waiting on I/O operations. Likewise, "backpressure" isn't a well
> defined term... I believe "congestion" is what you mean? But that's not
> something your library would be concerned with, only the application author
> would be (let's say they're reading from the filesystem and writing to a
> database, in this case the application author would implement congestion
> control because they're the one doing I/O).

You have not understood much about node.js. Asynchronicity, streams and
backpressure are at the core of Node.js. To recap:

1) If you want to process a file while you read it, without storing it all
in memory, then you HAVE to use streams. This is one of the major design
decisions Ruben was talking about.

2) Backpressure is a concept related to streams and asynchronicity. You
have two streams, A piped into B, so the data is flowing A --> B. If B
cannot process the data at the speed A is generating it, backpressure
kicks in and A slows down.
(reference: https://github.com/substack/stream-handbook)
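To make points 1) and 2) concrete, here is a minimal sketch that uses only
Node's standard fs, util and stream modules (the file name 'data.nt' is just
a placeholder):

  var fs = require('fs');
  var util = require('util');
  var Writable = require('stream').Writable;

  // A deliberately slow writable stream, simulating a consumer
  // (e.g. a database) that needs 100 ms per chunk.
  function SlowSink(options) {
    Writable.call(this, options);
  }
  util.inherits(SlowSink, Writable);

  SlowSink.prototype._write = function (chunk, encoding, callback) {
    setTimeout(function () {
      console.log('processed', chunk.length, 'bytes');
      callback(); // tells the pipe we are ready for the next chunk
    }, 100);
  };

  // The file is read chunk by chunk, never loaded whole into memory (point 1).
  var source = fs.createReadStream('data.nt');

  // pipe() wires source --> sink and applies backpressure (point 2): when the
  // sink's internal buffer is full, the file stream is paused, and it resumes
  // once the sink has drained.
  source.pipe(new SlowSink());

Note that the application never pauses or resumes anything by hand; pipe()
does that on its own.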
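And about the two-instances point above, this is roughly the node_modules
layout that makes it possible (the package names 'a', 'b' and 'foo' are made
up for the example):

  app/
    node_modules/
      a/
        node_modules/
          foo/    <- foo 1.0.0; require('foo') inside a finds this copy
      b/
        node_modules/
          foo/    <- foo 2.0.0; require('foo') inside b finds this copy

require() resolves against the nearest node_modules and walks up the
directory chain, so the two versions coexist without clashing; and when a
single version is fine for both packages, one copy higher up in the tree
(app/node_modules/foo) satisfies both.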
> The WebApps WG is currently discussing a streams API for e.g.
> XMLHttpRequest. We should look into this - the Node.js Streams API isn't
> very appropriate for the Web.

You are totally right. The problem is supporting both node.js and the web.
As an example, LevelGraph works in both environments, but it uses node.js
streams and browserify, which means the resulting 'bundle' is BIG.

> I don't think performance matters even the least in our initial
> comparison. Performance can always improve; the feature set, functionality,
> compatibility, and security is much more important. (Perhaps there's some
> cases where a better API design can allow for better performance, I would
> consider that functionality, not performance.)

Agreed.

Cheers,
Matteo