Future plans for DoomsDay Engine & some *nix-related bugs.


Comments

  • Fak wrote:
    Yagisan wrote:
    PRO TIP = int != pointer

    I've never programmed for 64-bit CPUs, but I thought the standard definition of 'int' was the size of the biggest register.
    Well, that was true on a 16-bit CPU I worked on. I also thought that a cast from a pointer to an int was always possible, as it was written in the 'reinterpret_cast' definition.
    It's very much not always possible to cast from int to pointer (and vice versa), as the sizes of int and pointer are not always the same. It's rather bad (but sadly common) practice to cast from int to pointer. 'reinterpret_cast' is also a C++ feature - Doomsday is at present C (excluding parts of the new order branch).

    Data type sizes are both compiler and platform dependent. Here's an example.

    Most 32-bit systems today (e.g. Windows, Linux, *BSD, Solaris, OS X) use the ILP32 data model:
    SHORT 16 bits, INT 32 bits, LONG 32 bits, LONG LONG 64 bits, POINTER 32 bits

    64-bit *NIX systems (e.g. Linux, *BSD, Solaris, OS X) use the LP64 data model:
    SHORT 16 bits, INT 32 bits, LONG 64 bits, LONG LONG 64 bits, POINTER 64 bits

    64-bit Windows uses the LLP64 data model:
    SHORT 16 bits, INT 32 bits, LONG 32 bits, LONG LONG 64 bits, POINTER 64 bits

    Other, less common 64-bit systems have been known to use the ILP64 data model:
    SHORT 16 bits, INT 64 bits, LONG 64 bits, LONG LONG 64 bits, POINTER 64 bits

    As a point of interest, Windows 3.x and Mac OS 9 and earlier, although running on 32-bit CPUs, used the LP32 data model:
    SHORT 16 bits, INT 16 bits, LONG 32 bits, LONG LONG doesn't exist (but would be 64 bits if a compiler supports it), POINTER 32 bits

    Due to 64-bit Windows and 64-bit *NIX choosing different data models, some care needs to be taken to ensure that 64-bit-clean code runs on both platforms. Fundamentally it comes down to this: don't assume sizes, and use the compiler's supplied pointer-sized types - they will be the right size for your platform. That generally means using C99 or C++ to get correctly sized types.
  • In these situations, a typedef like intp_t would be better, so that you can define the right one on each system.
    But in which cases are pointers used as ints in deng?
  • At this point there are relatively few (pertinent) instances of type size assumption left in the deng code base. Those which remain are primarily centered around the saved game reader/writer and the jHexen-specific ACS script sub-systems. The saved game reader/writer (the biggest remaining offender) is due to be replaced entirely in an upcoming 1.9.0-beta release, so the issues currently therein have not been addressed. The issues in jHexen's ACS script system are fairly innocuous, but they too will be addressed in a future release.

    In general we use the host system's default pointer size in nearly all instances.
  • Does that mean that the saves are not compatible with the original Doom, or just that there is an issue in the reading/writing of those files?
  • Although both jDoom and jHeretic are able to load original saved games produced by DOOM.EXE and Heretic.exe respectively, all other reading of/writing to saved games (including jHexen) uses bespoke format(s) which are not compatible with those used by the original games.
  • I am recently re-interested in the Doomsday engine, seeing the great progress towards 1.9.0, and was searching for info such as possible Boom support. Having tried the engine some time ago, I was disappointed by settings/engine quirks and a lack of Boom support.

    Version 1.9.0 is quickly eradicating all settings/engine quirks, which is why I care so much again right now. Yes, Boom 'support' can be had from Risen3D, however engine/settings quirks remain and installation is rigid. Doomsday has a simple zip distribution for the previous stable release, which I appreciate and hope continues for future stable releases, as I play Doom and other games from USB between my laptop and netbook.

    Just wanted to let it be known that there are some others still hoping for Boom support from this engine.
  • HackNeyed wrote:
    Just wanted to let it be known that there are some others still hoping for Boom support from this engine.
    This has already been addressed in this topic:
    skyjake wrote:
    GuntherDW wrote:
    Is there any intention of supporting boom any time?
    If danij wants to go for it, maybe, but I couldn't care less about Boom myself, sorry.
  • I intend to implement full BOOM support eventually. Various BOOM features are already implemented but stuff like the generalized line types are still to go.
  • I'm glad to hear something semi official. I knew some Boom things were being added but I didn't know if they were just 'fixes' or really heading for full support.

    Thanks DaniJ! I look forward to the day. :)
  • skyjake wrote:
    Do I understand this correctly: this is an issue for developers (who compile the code) on 64-bit Unix (Linux?) platforms. Assuming that a 32-bit build of Doomsday is packaged so that it depends on the requisite 32-bit support libraries, it's not a problem for the end-user to install the package and have the dependencies pulled in automatically. Right? I'm just trying to assess how significant this issue is.

    The significance of the issue is that a sizeable portion of amd64 GNU/Linux users will not be able to compile or run doomsday at all. A good chunk of the remainder will be turned away when they see the size of the required dependencies. Another segment of the population would be able to run it, if only they could figure out how to build it. Do note that on these platforms, "ordinary users" compile code too. It is natural and commonplace for software to be distributed only in source form. Building an i686 application with an amd64 toolchain is possible but complicated -- especially since "32-bit compatibility packages" are typically missing parts required to actually build software against them.

    For example, on a Gentoo system, the ability to run a single 32-bit X11 application requires over 100 megabytes of additional libraries to be installed. On the other hand, compiling Doomsday and stripping the binaries resulted in less than 5 megabytes.

    Furthermore, even if you expect everyone to use binary packages, it is an issue for you if users cannot compile the software easily on their machines. Upon encountering a problem with a piece of software, many GNU/Linux users immediately turn to the source code. They try to compile the program and fix the problem.
    If they manage to fix the problem, they send a patch. If they cannot fix the problem, they send a bug report. If they cannot compile the program, they delete it.
  • Ok, thanks for the clarification.
  • Hey, I'm not sure if it's been asked before, but if I may:
    Is it planned for Doomsday, or at least been thought upon, that its physics (as minimal as they are) have the option of being calculated with ATI Havok/Ageia-Nvidia PhysX?
  • Psychikon wrote:
    Hey, I'm not sure if it's been asked before, but if I may:
    Is it planned for Doomsday, or at least been thought upon, that its physics (as minimal as they are) have the option of being calculated with ATI Havok/Ageia-Nvidia PhysX?
    That has not been considered, nor is it being planned at the moment. Doom physics are so exceedingly simple that I don't really see the point of using a dedicated physics library. If at some point it becomes relevant to do more complicated physics calculations (e.g., for "ragdoll" models), the support library options will of course be investigated.
  • Psychikon wrote:
    Hey, I'm not sure if it's been asked before, but if I may:
    Is it planned for Doomsday, or at least been thought upon, that its physics (as minimal as they are) have the option of being calculated with ATI Havok/Ageia-Nvidia PhysX?
    That sort of thing is being worked on in Shinka. Link in my sig.
  • Yagisan wrote:
    That sort of thing is being worked on in Shinka. Link in my sig.
    I figured that it might have been a possibility with Shinka, 'cause I thought you were going to end up causing it to be more like a modern game in its function.

    I was concerned about the Doomsday side for a couple of reasons:
    -I remember that a mention was being made about a bottleneck, and that the current build was CPU-intensive where it shouldn't have to be
    -I thought of the fact that a massive amount of particles ( :D :yay: :thumb: :party: :crazy: ) with realistic physics, if they were implemented, would be ideally run (possibly) through a separate GPU or a PhysX card

    It was just a thought really, but I thought that I might throw it out there, 'cause I'm not really sure how complex things are or how they're going to get or how complex they could get in theory.

    I understand that Doomsday has its own vision and Shinka has its own, and that the idea behind Doomsday was never to go over-the-top with all the features, but mainly graphical stuff. In other words, not going in the direction of EDGE or Zdoom, which are content with software-drawn, but extremely-highly-modified levels.

    It's too bad I'm not skilled at any programming language. I really wish that I could help out with bringing all the new releases to fruition. :cry:

    EDIT: I'm fairly competent with the pseudo-language GML (Game Maker Language)! :lol:
  • I looked through the Roadmap some (and I hope I'm not bothering), but I didn't find the answer to my question(s) where I looked.

    Here is/are the question(s):
    Because of the way the new engine will be handling Doom, Heretic, and Hexen, are the DEDs planned to have increased functionality and flexibility? If so, even to the point where scripts could be written for Doom levels, or shared events types for things from game type to game type? Also, would it possibly be possible that, in the future, this might allow for user-defined thing types etc.? I'm not sure if I'm being clear, because I'm not sure exactly what to call it... As an example (not what I was thinking), would polyobjects be theoretically possible in Doom for instance with the new engine?
  • I'm still waiting for the existing modding features to be fully re-enabled. XG has been crippled to the point of near uselessness for about 3 years now (since Beta5).

    It's been so long, that some in the Doom community now think that Doomsday has no new mapping features over the original DOS engine.
  • Psychikon wrote:
    Because of the way the new engine will be handling Doom, Heretic, and Hexen, are the DEDs planned to have increased functionality and flexibility?
    Increased functionality and flexibility for definitions is one of the key goals, yes. (The "DED Reader 2.0".)
    Psychikon wrote:
    If so, even to the point where scripts could be written for Doom levels, or shared events types for things from game type to game type?
    Scripts for all games is another key goal, so yes again. I'm not sure about what you mean by the latter, but the general principle is to reduce restrictions and limitations in all games, when it comes to thing types, etc.
    Psychikon wrote:
    Also, would it possibly be possible that, in the future, this might allow for user-defined thing types etc.?
    Definitely. That's the reason for making things flexible: to ultimately allow user-defining pretty much everything.
    Psychikon wrote:
    I'm not sure if I'm being clear, because I'm not sure exactly what to call it... As an example (not what I was thinking), would polyobjects be theoretically possible in Doom for instance with the new engine?
    Polyobjs will be available as a generic feature in all games in Doomsday 2.0. (Polyobjs are another important goal.)
    Vermil wrote:
    I'm still waiting for the existing modding features to be fully re-enabled. XG has been crippled to the point of near uselessness for about 3 years now (since Beta5). It's been so long, that some in the Doom community now think that Doomsday has no new mapping features over the original DOS engine.
    I understand the issue. However, while XG has a valid reason for existence (defining what line/sector types do), the design was flawed from the start in that it had to rely on the extremely limited DED parser. I've yet to evaluate XG's role in the future and what shape it'll take in the 2.0 architecture. Rest assured, though, that defining custom line/sector types will be possible in the future (in a way that's much more powerful than XG ever was, or could've been.)

    However, all of this won't be happening overnight: we'll be rolling out the modding capabilities over time. Some will happen before the 2.0.0 release and some after it.
  • @SkyJake You have answered all of my questions! Thank you, that is all very good to know!