Author Topic: Ubuntu: The Verdict  (Read 4179 times)

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #15 on: 12 July 2005, 12:37 »
I think there should be ONE repository storing packages (not debs or rpms, but something designed specifically for this purpose with a name like "universal package") that contain the original source code for the package plus some patches (INSIDE the "universal package", like in a "/patches/" directory, at least for the more important ones) required to make the code compile cleanly under whatever circumstances, add some important functionality, or fix bugs. Then the distributors could WORK TOGETHER to keep this repository UP TO DATE, and release their own experimental patches that, once tested (by the distribution's users, and by users of other distributions; part of the repository would probably house the experimental patches) and deemed secure and stable enough, get added as a patch to the "universal package".
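To make that concrete, a "universal package" might look something like this on disk (purely a sketch; the layout, the names and the zlib example are made up for illustration):
Code: [Select]
zlib-1.2.2/                              # the "universal package" for zlib 1.2.2
    source/zlib-1.2.2.tar.gz             # the pristine upstream tarball, never modified
    patches/fix-buffer-overflow.patch    # an important (e.g. security) patch
    patches/experimental/                # untested patches live separately until proven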

ANYBODY could download a "universal package" straight from the repository, compile it and install it easily (using frontends, perhaps something like Synaptic, which could ask the user which patches to apply) on ANY distribution.
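Done by hand, without any frontend, the flow would be roughly this (the file and patch names are only examples):
Code: [Select]
# unpack the "universal package" itself
tar xzf zlib-1.2.2.upkg.tar.gz && cd zlib-1.2.2

# unpack the pristine upstream source stored inside it
tar xzf source/zlib-1.2.2.tar.gz && cd zlib-1.2.2

# apply whichever patches you decide you want
patch -Np1 -i ../patches/fix-buffer-overflow.patch

# then the usual
./configure --prefix=/usr && make && make install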

The distributors could even compile the "universal packages" for their users, package them in RPM or DEB format, and put them into their own repositories. They would still gain from faster bug and security fixes. The only thing that would be missing is the user's control over which patches are in use (which might cause issues for users of certain (noob) distributions). But it would have its benefits.
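The distributor side of that is nothing exotic; a rough sketch (the spec file name is made up, and the patching step is exactly the manual flow above):
Code: [Select]
# distributor side: after applying the chosen patches as above,
# roll the patched tree back into a tarball...
tar czf zlib-1.2.2-patched.tar.gz zlib-1.2.2
# ...and feed that to the distro's normal tooling, e.g. as Source0 of a hypothetical zlib.spec:
rpmbuild -ba zlib.spec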

Other operating systems (not just distributions) could also take advantage of this large repository of software, particularly GNU/Hurd, *BSD, and some more (probably). Just like the GNU/Linux distributions, they could provide their own patches (experimental or otherwise) to make the package work on that OS.

The authors of the software, of course, could take patches from the repository and apply them for the next release.
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

ksym

  • Member
  • **
  • Posts: 65
  • Kudos: 30
Re: Ubuntu: The Verdict
« Reply #16 on: 13 July 2005, 02:18 »
Quote from: piratePenguin
I think there should be ONE repository storing packages (not debs or rpms, but something designed specifically for this purpose with a name like "universal package") that contain the original source code for the package plus some patches (INSIDE the "universal package", like in a "/patches/" directory, at least for the more important ones) required to make the code compile cleanly under whatever circumstances, add some important functionality, or fix bugs. Then the distributors could WORK TOGETHER to keep this repository UP TO DATE, and release their own experimental patches that, once tested (by the distribution's users, and by users of other distributions; part of the repository would probably house the experimental patches) and deemed secure and stable enough, get added as a patch to the "universal package".

ANYBODY could download a "universal package" straight from the repository, compile it and install it easily (using frontends, perhaps something like Synaptic, which could ask the user which patches to apply) on ANY distribution.

The distributors could even compile the "universal packages" for their users, package them in RPM or DEB format, and put them into their own repositories. They would still gain from faster bug and security fixes. The only thing that would be missing is the user's control over which patches are in use (which might cause issues for users of certain (noob) distributions). But it would have its benefits.

Other operating systems (not just distributions) could also take advantage of this large repository of software, particularly GNU/Hurd, *BSD, and some more (probably). Just like the GNU/Linux distributions, they could provide their own patches (experimental or otherwise) to make the package work on that OS.

The authors of the software, of course, could take patches from the repository and apply them for the next release.

HAHA! OLD!

There is nothing new in this idea ... though it is quite good.

Heard of Gentoo GNU/Linux, anyone?
I used that for about a year, and it had all of the ideas above.

It was a decent distro, but started sucking cock later on ...
Installing software bloated my HD with development headers, and there was NO mechanism to do reverse dependencies, e.g. I could not easily remove an already-installed package.

I'm back to Debian.
People are stupid.
So: All Operating Systems suck because the people who make them are mostly retards.
-- My piece of Neo-Zen Wisdom

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #17 on: 13 July 2005, 08:31 »
Quote from: ksym
Heard of Gentoo GNU/Linux anyone?
Yes.
Quote from: ksym
I used that for about a year, and it had all of the ideas above.
Yes, but Gentoo has its repository, Debian has theirs, Mandriva has theirs... It's all fucked up. If you read the first sentence properly:
Quote from: me
I think there should be ONE repository
I've never used Gentoo, though I might try it after this. I've started looking at FreeBSD, and its package management is pretty similar (and very fecking good).

If this "ONE" repository existed and the distributors took it seriously, I see no reason that the FreeBSD guys couldn't contribute to it too... They get mostly the same bugs and security advisories as us (take a look).
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

worker201

  • Global Moderator
  • Member
  • ***
  • Posts: 2,810
  • Kudos: 703
    • http://www.triple-bypass.net
Re: Ubuntu: The Verdict
« Reply #18 on: 13 July 2005, 11:44 »
Can't help but notice that this is a form of computer fascism.  Consolidating and centralizing power/packages leads to dependence and inefficiency.  Overall, Linux is not developed this way, and the community will resist your attempts to steer it toward some sort of homogenization.  The "do what you like" marketing theory has been put to the test, and actually seems to produce quality products.  If you start making everybody do the same thing, that's Microsoftism.

The beautiful thing about standards is that there are so many of them!

ksym

  • Member
  • **
  • Posts: 65
  • Kudos: 30
Re: Ubuntu: The Verdict
« Reply #19 on: 13 July 2005, 13:45 »
Quote from: piratePenguin
If this "ONE" repository existed and the distributors took it seriously, I see no reason that the FreeBSD guys couldn't contribute to it too... They get mostly the same bugs and security advisories as us (take a look).

That is not possible.

You see, every Linux distro has its own base system. Each piece of software is uniquely tailored to this base system, statically --prefixed under /usr or /opt. Each system has its own scheme for dealing with soname dependencies, command-namespace dependencies and package upgrading.
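A rough illustration of what I mean (just example commands, nothing to do with any unified scheme; the library paths vary by distro):
Code: [Select]
# the same upstream source ends up in different places depending on the distro's choices
./configure --prefix=/usr              # the prefix most distros patch in
./configure --prefix=/opt/someapp      # what a self-contained layout would use instead

# and each system keeps its own idea of which sonames the dynamic linker knows about
ldconfig -p | grep libz                      # sonames currently in this box's ld.so cache
objdump -p /usr/lib/libz.so.1 | grep SONAME  # the soname a given library was built with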

What this comes down to is that a centralized repository would need the distros using it to share the same base-system schema. And if that were so, they would actually be ONE AND THE SAME SYSTEM ;D
AND we would have to brainwash EVERY fucking OSS hacker into believing in our "one-and-only" base system in order to make them port their software to our Nazi-Linux.

The idea is good, but it just would not work. Like I said earlier, in order to make the OSS scene co-operate, you would have to be GOD and throw all the naysayers into burning hells. Got it?

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #20 on: 14 July 2005, 04:14 »
Oh for fuck's sake. I'm after typing up a fecking huge reply in Firefox and just lost it by middle-clicking outside the textbox. :mad:

Quote from: worker201
Can't help but notice that this is a form of computer fascism.
You are wrong. I wouldn't force this system on anyone.
Quote from: worker201
Consolidating and centralizing power/packages leads to dependence and inefficiency.
I get the dependence part. You could say that about anyone or anything that works together with someone or something else.

Previously I depended on Microsoft, then the Mandriva developers, then X, then Y. Which would you trust more though, out of them or my system? That's the important thing.

I don't get the inefficient part. Please elaborate on that. The main pro of my system, from what I can see, is that it makes our currently very inefficient system as efficient as possible. Currently, when there is a bug in some package, say zlib, then FreeBSD, Debian, Gentoo etc. are ALL working on a DIFFERENT patch and applying it to their own repositories. Inefficient.
Quote from: worker201
Overall, Linux is not developed this way
If it were, it would be an efficient system, I could not dream of improving upon it, and this discussion would not be taking place.
Quote from: worker201
the community will resist your attempts to steer it toward some sort of homogenization.
If you think that I intend "to steer it [the community] toward some sort of homogenization", then you are mistaken.
Quote from: worker201
The "do what you like" marketing theory has been put to the test, and actually seems to produce quality products.
So you believe that it's that "do what you like" "marketing theory" that is the reason we have such high quality free software? I believe otherwise.
Quote from: worker201
If you start making everybody do the same thing, that's Microsoftism.
If I force them to, then maybe. I'm not gonna force anyone to do anything, so don't compare me to them fucking gaylords please.
 
Quote from: worker201
The beautiful thing about standards is that there are so many of them!
And what have I been thinking about doing the last while? Deleting standards? Is that what you think I intend on doing?

Creating standards, maybe.

Quote from: ksym
That is not possible.
In which case it will be abandoned as soon as all hope is lost.
 
Quote from: ksym
You see, every Linux distro has its own base system. Each piece of software is uniquely tailored to this base system, statically --prefixed under /usr or /opt. Each system has its own scheme for dealing with soname dependencies, command-namespace dependencies and package upgrading.
Whether these facts are a good or bad thing for the different distributions is arguable. Anyhow, like I've said before:
Quote
The distributors could even compile the "universal packages" for their users, package them in RPM or DEB format, and put them into their own repositories. They would still gain from faster bug and security fixes. The only thing that would be missing is the user's control over which patches are in use (which might cause issues for users of certain (noob) distributions). But it would have its benefits.
Maybe that way ^^ should be the standard, but that's not exactly up to me (whoever adopts it will define which method is 'standard').
Quote from: ksym
What this comes down to is that a centralized repository would need the distros using it to share the same base-system schema. And if that were so, they would actually be ONE AND THE SAME SYSTEM ;D
Did you miss the whole patches bit? And the whole distributors-may-compile-their-own-packages bit?

You appear not to be understanding much of anything, TBH. How did you cope with the can-be-shared-between-different-OSes bit? "SAME SYSTEM", yeah fucking right.
Quote from: ksym
AND we would have to brainwash EVERY fucking OSS hacker into believing in our "one-and-only" base system in order to make them port their software to our Nazi-Linux.
No brainwashing. What I had in mind is educating them and then letting them decide for themselves. But whatever.
 
Quote from: ksym
The idea is good, but it just would not work.
That's what you think.
Quote from: ksym
Like I said earlier, in order to make the OSS scene co-operate, you would have to be GOD and throw all the naysayers into burning hells. Got it?
I'd be glad to prove you wrong. But wait, that's already done. They are co-operating, just not well enough.

Anyhow, this system I have in mind: I see nothing but benefits it could bring. Better freedom (the user chooses which patches are applied; distributors may use the universal repository to compile their own binary packages for their distro's users; the user may not even need to know the universal repository exists). Better convenience (all source code and patches in the same repository; it can be compiled easily and cleanly, with the right patches). Better cooperation and, inherently, efficiency.
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

worker201

  • Global Moderator
  • Member
  • ***
  • Posts: 2,810
  • Kudos: 703
    • http://www.triple-bypass.net
Re: Ubuntu: The Verdict
« Reply #21 on: 14 July 2005, 06:31 »
Perhaps something like this could work.  Here's what you would need to do, I think.  Have your distribution system work like php.  Then the source code to all these programs gets dropped into the database.  When my computer running FC4 stops by to pick up the latest release of transcode, the package manager looks at my system and determines what flags are required to create a package custom-suited to my needs.  The package manager then gives these requirements to the distro system, which produces a package custom-fit for me.  It would also store a compressed copy of the package in the database, just in case someone else with similar requirements comes for the package.

In this system, packages are built on the fly based on distro.  So anyone using Fedora, Gentoo, YDL, SuSE, Debian, or some other Linux could get a package from it.

Of course, this is rougher than it sounds.  Basically, the package manager client is handing the distro system configure and compiler flags, and the system then builds an rpm (for example) with those criteria.
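A very rough sketch of the exchange I have in mind (every tool name and option here is invented, just to pin the idea down):
Code: [Select]
# client side (hypothetical 'unibuild' tool): describe this box and ask for a build
unibuild request transcode-1.0.0 \
    --distro=fedora-core-4 --arch=i686 \
    --flags="--enable-theora --enable-vorbis"    # configure flags the client worked out locally

# server side (sketch of what it would do with that request):
#   1. apply the distro's patches to the pristine transcode source
#   2. ./configure with the submitted flags && make
#   3. wrap the result as an .rpm, send it back, and cache a compressed
#      copy for the next client with the same requirements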

Anything else might seem like forcing a standard.  I think that being able to choose between apt, yum, rpm, yast, up2date, slapt, and others is part of what makes computers so cool - it takes all kinds.  Providing an efficient and simple way to get their packages, well that's fine.

(much of my last post was political hooey, although I do think Slackware is excellent proof that dollar capitalism and market pressure are not necessary to make a quality free product.  Patrick Volkerding does it because he loves it, and everyone benefits from his love.  If only cars and keyboards were made that way!)

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #22 on: 14 July 2005, 08:25 »
Quote from: worker201
Perhaps something like this could work.
There is hope!
Quote from: worker201
Here's what you would need to do, I think. Have your distribution system work like php. Then the source code to all these programs gets dropped into the database. When my computer running FC4 stops by to pick up the latest release of transcode, the package manager looks at my system and determines what flags are required to create a package custom-suited to my needs. The package manager then gives these requirements to the distro system, which produces a package custom-fit for me. It would also store a compressed copy of the package in the database, just in case someone else with similar requirements comes for the package.
Sounds good.
Quote from: worker201
In this system, packages are built on the fly based on distro.
Hmm hmm... I dunno about that TBH. Although, they could provide patches for each and every package to make it compile exactly correctly for their distribution. Which I probably would've needed anyhow. In which case, such a system would (read: should) be piss easy to implement. It wouldn't even need to use the precious resources of the core repository server(s).
Quote from: worker201
Anything else might seem like forcing a standard. I think that being able to choose between apt, yum, rpm, yast, up2date, slapt, and others is part of what makes computers so cool - it takes all kinds. Providing an efficient and simple way to get their packages, well that's fine.
Well, the raw core repository will still be open for reading by people like moi. And there'd always need to be some easy way to get packages from that core repository, even for the distributors. Making the packages simple to compile (as in, straightforward like './configure && make && make install') is one goal. The distribution-specific patch for every single package is a requirement for that. Although... maybe it could be worked around... Like, the patch used by Fedora for gzip would be pretty similar to the patch used by Fedora for bzip2 and tar and binutils and coreutils, but that'll need to be looked into. What all is different 'tween distros? (--prefix, and little more that I know of (probably --mandir and friends). Then there's library stuff that I know nothing about.)
If automation worked (as in, './configure && make && make install' worked flawlessly every time on every major distro) this system could be classic. Then that web-based thing would be possible, as well as 'upkg tar' on every single distro, to get the tar source code, apply whatever patches you select (or have an -auto option), compile and install.
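Roughly what I'm picturing for the client end (the 'upkg' command and its options are completely made up at this point):
Code: [Select]
upkg tar                  # fetch the tar "universal package", ask which patches to apply,
                          # then ./configure && make && make install
upkg -auto tar            # same, but apply every recommended patch without asking
upkg --list-patches tar   # just show the available patches and their importance values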

Anyhow, I spent the last 3 hours typing this out (I took my time, was browsing and stuff while doing it). It's not necessarily complete, I've got more to add I think, and it goes into quite a lot of detail (I've tacked a rough sketch of the patch metadata on after the quote)...:

Quote
The universal package repository contains all the source code, untouched. The exact version of the source code in the repository is the exact version of the source code as retrieved from the package author (usually from the package's website). Once the source code is in the repository, it is never modified. Instead, patches are stored in a different directory and applied before the source code is compiled. This provides for added flexibility and freedom, because whoever is compiling the package (usually a distributor or user) has the added freedom of choosing which patches get applied to what they install.

Patches will be given a number used to determine their importance, as evaluated either by the package maintainer or by a privileged group of individuals (who have earned their privileges; the main people fitting into this category would be security experts and the like). Distributors, who are generally expected to provide frontends to the official command-line tools used to access the core repository, may override a patch's importance value. They could also submit recommendations about the patch importance value to the maintainers of the package.

If the maintainers are irresponsible, someone may contact the core maintainers, who have the power to remove package maintainers from their duties and add replacements for them. When the user tries to compile or download a package from the core repository, they may use either the official command-line tools or the frontend that usually comes with their distribution. Either way, they will be given a list of available patches, along with their descriptions and importance values (taken either directly from the core repository or from the distribution's overrides, which are only available when using the distribution frontend, or any frontend using up-to-date distribution-specific settings, likely retrieved from the distribution website).

There is one rather special patch, obey_uni-pkg-standard.patch. This patch usually only patches the configure script (TODO: learn about and probably mention Makefile.in and friends here, assuming they are relevant (which I _think_ they are)) provided by most packages. It makes the package obey the uni-pkg standard for installing packages. The uni-pkg standard has yet to be defined, but by the time this system gets implemented, assuming it does get implemented, we expect that this standard will be clearly defined. It will only be provided for packages that do not already obey the standard, and probably not even then. Distributors are expected, if they offer source packages to their users, to provide a similar patch for their distribution setup in a distribution-specific folder of the repository. This folder should hold absolutely nothing else.

Whenever a bug is found, a patch is made by the distributors or others (who all operate together) and sent to the package maintainers for inclusion in the repository. When the package maintainers add it to the repository, they give it a very high importance value, especially if it's a fix for a security bug. When users update that package, they will get this patch (possibly among others), and it will be applied to the source code and the package rebuilt and reinstalled. Distributors could automate this process in frontends. Anyone installing the package later on will see that it is an important patch and will (usually) include it when choosing which patches to apply to the source code. There may be a sub-directory of each package's patches directory for storing experimental patches, purely for testing purposes.

The original package creators are more than welcome, and recommended, to use the patches from the repository and include some of them in the next release. Whenever a new version of a package is released, the source code is added as an entirely new package to the repository with a fresh and empty patches directory. Any patches from the previous version that are still relevant may be copied across, optionally after being modified. Now, whenever a user updates a package, they will be told about the newer version, and will most likely choose to download it instead (they will be recommended to), apply whichever patches are available and appeal to them, compile and install. The older version would also be uninstalled. Distributors may disallow updating certain packages for whatever reason, but only if the user uses their frontend.

That's source packages. Source packages have their advantages and their disadvantages. As do binary packages, discussed now. Binary packages, officially, are not supported. However, the repository stores all the source code and the patches, so the distributors may compile the source packages, package them in their own package format and distribute them to their users through their own repositories. Tools are likely to be built to automate this, though they will not be officially supported.
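To pin down how the importance values and descriptions might actually be stored, here's a rough sketch of a per-patch metadata file (the format and every field name are made up; nothing is decided):
Code: [Select]
# patches/fix-buffer-overflow.patch.info  (hypothetical)
description  = fixes a remotely triggerable buffer overflow in the decompression code
importance   = 95          # 0-100, set by the package maintainers; distros may override
status       = stable      # as opposed to living under patches/experimental/
submitted-by = a distributor's security team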
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

worker201

  • Global Moderator
  • Member
  • ***
  • Posts: 2,810
  • Kudos: 703
    • http://www.triple-bypass.net
Re: Ubuntu: The Verdict
« Reply #23 on: 14 July 2005, 09:05 »
Wow, this is soooo off topic.

I think the system could be even easier than that.  I've never had to apply a patch before, and I think your patching system might be avoidable.  I use apt for my packages, and instead of releasing patches, they release minor or micro version releases.  Like, if foo-1.4.5 gets a really small tweak, it comes as foo-1.4.5-a or something.  The apt system just kills the old one and installs the new one.  So instead of having a complicated patch system, perhaps a micro-versioning system would be more efficient.

An example of how things could work:
Let's say I want to install transcode-1.0.0.  Here's the actual configure line I used when installing transcode-1.0.0b:
Code: [Select]
% ./configure --enable-mmx --enable-sse --enable-sse2 --enable-freetype2 --enable-lame --enable-ogg --enable-vorbis --enable-theora --enable-libquicktime --enable-a52 --enable-libmpeg3 --enable-libxml2 --enable-mjpegtools --enable-imagemagick --with-libavcodec-includes=/usr/include/ffmpeg
% export CFLAGS="-O2 -fomit-frame-pointer -mmmx -msse -mfpmath=sse"

Instead of all this hassle (which I actually kinda enjoy), there should be some kind of intelligent program which will bring up a dialogue asking me what options I am interested in, and recognize what options I have the resources for.  Let's say I don't have libtheora installed.  Then the program says "Get and enable theora support?" and then maybe shows an explanation of what theora is.  If I say yes, then it writes --enable-theora to a config script.  Of course the configure script already has the personalized stuff I need in it, like hostname, arch, and all that crap.  Then it gets the source, builds a package via my packaging system, and installs it.  As an option, I can store the package locally, or delete it after installation.  Whether I delete it or not, the configure info is kept, so replacing the package is easy enough.
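For example, something along these lines could do the libtheora check (just a quick sketch; the prompt and the flag handling are simplified):
Code: [Select]
#!/bin/sh
# sketch: offer theora support only if libtheora is actually available
FLAGS="--enable-mmx --enable-sse"            # whatever the tool already decided on
if pkg-config --exists theora; then
    printf "Get and enable theora support (a free video codec)? [y/N] "
    read answer
    [ "$answer" = "y" ] && FLAGS="$FLAGS --enable-theora"
fi
echo "./configure $FLAGS" > build-options.sh # kept around so the package can be rebuilt later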

You know what, this is starting to sound like not much more than a giant CVS system.  Except you don't give the code back after you check it out.  Like a library where you get to keep the books.  I bet all the technology to do this could actually be scraped out of some existing things, like cvs, curl, doxygen, autoconf, and automake, for example.

Just a thought.  I don't even know if we're talking about the same thing.  What I envision is a system that delivers code to the client, who then personalizes it.  No need for developers to waste time on building installation packages, and no need for users to google all day trying to find the right package for their system.  Your computer gets the source and knows what to do with it.

It would also be nice to have a smart archive, too.  So if I want to get a program that splices mpeg movies together, it will recommend one for me.  And then get it.  Instead of me having to read untold pages of documentation before finding out that mpgtx is the program I want.

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #24 on: 15 July 2005, 00:42 »
Quote from: worker201
I think the system could be even easier than that. I've never had to apply a patch before, and I think your patching system might be avoidable.
'patch -Np1 -i ../patches/fix-whatever.patch', simple as that. And it'd be automated. It'll ask what patches you want in, then it'll patch the source code, then it'll compile, then it'll install.
Quote from: worker201
I use apt for my packages, and instead of releasing patches, they release minor or micro version releases.
Patches are staying. Distributions may use microversions in the packages in their repositories, using the patches they like from the universal repository.
Quote from: worker201
So instead of having a complicated patch system, perhaps a micro-versioning system would be more efficient.
Patches are more efficient and less complicated. I dunno how micro-versioning could possibly work in this system, unless the distributors do it with their packages (easy).
Quote from: worker201
An example of how things could work:
Let's say I want to install transcode-1.0.0.  Here's the actual configure line I used when installing transcode-1.0.0b:
Code: [Select]
% ./configure --enable-mmx --enable-sse --enable-sse2 --enable-freetype2 --enable-lame --enable-ogg --enable-vorbis --enable-theora --enable-libquicktime --enable-a52 --enable-libmpeg3 --enable-libxml2 --enable-mjpegtools --enable-imagemagick --with-libavcodec-includes=/usr/include/ffmpeg
% export CFLAGS="-O2 -fomit-frame-pointer -mmmx -msse -mfpmath=sse"
Instead of all this hassle (which I actually kinda enjoy), there should be some kind of intelligent program which will bring up a dialogue asking me what options I am interested in, and recognize what options I have resources for.
That could be added into the client, I think.
Quote from: worker201
You know what, this is starting to sound like not much more than a giant CVS system.
It is a lot like CVS, but it is not the same. We couldn't use CVS for the repository, because you wouldn't be able to choose which patches get applied.
Quote from: worker201
I bet all the technology to do this could actually be scraped out of some existing things, like cvs, curl, doxygen, autoconf, and automake, for example.
A lot of it will be.
Quote from: worker201
Just a thought. I don't even know if we're talking about the same thing. What I envision is a system that delivers code to the client, who then personalizes it. No need for developers to waste time on building installation packages, and no need for users to google all day trying to find the right package for their system. Your computer gets the source and knows what to do with it.
Automation will be possible, but because there are so many distributions, they each need to provide a patch for each package to make it compile properly for _their_ system. Then the client can do the rest easily.
Quote from: worker201
It would also be nice to have a smart archive, too. So if I want to get a program that splices mpeg movies together, it will recommend one for me. And then get it. Instead of me having to read untold pages of documentation before finding out that mpgtx is the program I want.
I'm sure that could be added to a frontend or something.
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

ksym

  • Member
  • **
  • Posts: 65
  • Kudos: 30
Re: Ubuntu: The Verdict
« Reply #25 on: 17 July 2005, 09:03 »
JUST REMEMBER:

I do NOT want anything non-base-system-specific in my /usr directory, or I will sue your ass in the highest court ... if possible ;)

And there must be a way to say that I want ONLY the runtime libraries/binaries installed, so that the development includes, m4 macros and pkgconfig entries are installed ONLY if I want them!

Make the system behave so that this software is installed under the /opt/unipkg/ hierarchy.

For easy maintenance, each package should be installed in its own isolated directory, e.g. /opt/unipkg/<package>/. Each app can then be launched either directly from its directory, or symlinks can be made to its binaries. These symlinks would be stored in, for example, /usr/local/bin.

A version control mechanism MUST be provided. I want to be able to decide EXACTLY which version of the development headers I use for my project. This could be done quite simply by installing the package's binary images into /opt/unipkg/<package>/<version>/, and installing the includes/pkgconfig entries/m4 macros into /opt/unipkg/<package>/<interface>/.

The <version> would be the real version of the package, e.g. 1.2, and the <interface> would be the version of a compatible interface, like 1. Then we could determine that the development files in /opt/unipkg/<package>/1 are compatible with the runtime binaries in the directories /opt/unipkg/<package>/1.x ...

And if the package uses some other versioning scheme, there
should be some other mechanisms to resolve compatibilities ...
but deterministic versioning support is a must!
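Sketched out on disk, the layout I mean would look something like this (package names made up):
Code: [Select]
# hypothetical layout under the scheme above
/opt/unipkg/libfoo/1.2/lib/libfoo.so.1.2    # runtime image for release 1.2
/opt/unipkg/libfoo/1.3/lib/libfoo.so.1.3    # a later, interface-compatible release
/opt/unipkg/libfoo/1/include/foo.h          # dev headers for interface version 1
/opt/unipkg/libfoo/1/lib/pkgconfig/foo.pc   # pkgconfig entry for the same interface

# convenience symlink for an app's binaries:
ln -s /opt/unipkg/myapp/2.0/bin/myapp /usr/local/bin/myapp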


A smart ldconfig manager is also needed, so that each library is installed in its own directory, and the directory entry is then added to /var/unipkg/ldd/ld.so.conf, which is parsed with ldconfig into a temporary /var/unipkg/ldd/ld.so.cache used when launching unipkg-specific applications.

When installing a package with some runtime libraries, the sequence would go:
Code: [Select]

# check whether the package's /opt/unipkg/<package>/<version>/lib path is in /var/unipkg/ldd/ld.so.conf, and if not, add it there

ldconfig -f /var/unipkg/ldd/ld.so.conf -C /var/unipkg/ldd/ld.so.cache

To launch a unipkg app, the command sequence could be
something like this:
Code: [Select]
ldconfig -N -X -f /var/unipkg/ldd/ld.so.conf -C/var/unipkg/ldd/ld.so.cache && /opt/unipkg/MyApp/bin/myapp

Well, this is my idea of how package management SHOULD be done. I really hate the way modern distros spread their libraries/binaries/whatever all over the /usr hierarchy. It is totally chaotic ... it just sucks!

Anyways, I really hate these defects in the current GNU/Linux distributions:
- no support for EXACT development-component versioning, e.g. I have no EASY way to determine exactly which interface version of a library I wish to link my program against
- no way to dynamically relocate packages, since all software is statically prefixed to /usr ... this is a fucking retarded way to install software. What is so deadly wrong with installing each piece of software into its own, isolated directory structure?
- ldconfig has many options which allow dynamic relocation of libraries (e.g. each lib CAN be in its own directory), but these options are not used ... pff

I started ranting again, but I do it for the sake of the
whole OSS scene. They do not know how wrong they do things ...

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #26 on: 17 July 2005, 17:30 »
Quote from: ksym
I started ranting again, but I do it for the sake of the
whole OSS scene. They do not know how wrong they do things ...
All that stuff you describe could be used in a separate package manager, like RPM or whatever. Someone could package the source ('universal') packages into an RPM and distribute that. And if RPM doesn't suffice, they could use/invent their own package manager.

I installed FreeBSD yesterday (again); its ports collection is very nice, and to my surprise, it *does* use patches to make changes to the original code! And after that, all it takes is 'make && make install', and the package is compiled and installed. Bloody brilliant.
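For anyone who hasn't seen it, it goes roughly like this (I'm using vim as an example port; the details vary per port):
Code: [Select]
# the port's own patches sit right beside it in the ports tree
ls /usr/ports/editors/vim/files/        # patch-* files applied to the upstream source
cd /usr/ports/editors/vim
make && make install                    # fetch the original tarball, apply the patches, build, install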

So yea, nothing innovative here.
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #27 on: 17 July 2005, 22:37 »
On the topic of package management, this project seems quite interesting.
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.

worker201

  • Global Moderator
  • Member
  • ***
  • Posts: 2,810
  • Kudos: 703
    • http://www.triple-bypass.net
Re: Ubuntu: The Verdict
« Reply #28 on: 18 July 2005, 19:36 »
Quote from: piratePenguin
On the topic of package management, this project seems quite interesting.


If grandma can't install an rpm package, why do we want to make it so she can install from source?

I like Linux like it is, really, and I don't want anything to be dumbed down.

piratePenguin

  • VIP
  • Member
  • ***
  • Posts: 3,027
  • Kudos: 775
    • http://piratepenguin.is-a-geek.com/~declan/
Re: Ubuntu: The Verdict
« Reply #29 on: 18 July 2005, 21:53 »
Quote from: worker201
If grandma can't install an rpm package, why do we want to make it so she can install from source?
Erm, I dunno about yer grandma, but most users do know how to install RPMs on an RPM-based distro (e.g. Mandriva). There are graphical frontends and all that now.
Quote from: worker201
I like Linux like it is, really, and I don't want anything to be dumbed down.
I like the way it works, to an extent. If I had the chance, I'd make some changes. But still, it's working alright.

What do you mean by "dumbed down" though?
"What you share with the world is what it keeps of you."
 - Noah And The Whale: Give a little love



a poem by my computer, Macintosh Vigilante
Macintosh amends a damned around the requested typewriter. Macintosh urges a scarce design. Macintosh postulates an autobiography. Macintosh tolls the solo variant. Why does a winter audience delay macintosh? The maker tosses macintosh. Beneath female suffers a double scum. How will a rat cube the heavier cricket? Macintosh calls a method. Can macintosh nest opposite the headache? Macintosh ties the wrong fairy. When can macintosh stem the land gang? Female aborts underneath macintosh. Inside macintosh waffles female. Next to macintosh worries a well.