On Week 39

edited 2011 Oct 10 in Developers
skyjake:

My brief hiatus from multiplayer debugging continued last week, as I focused on build system maintenance and on starting the remove-sdlnet branch. I must say that taking a break from the debugging has really improved my motivation for working on the project, particularly as we've reached a stage where we have a good reason to introduce libdeng2 to the main codebase. This is something I've long wanted to do, and it will fulfill two important goals on the roadmap: enabling development with C++ and allowing the use of Qt. It is worth mentioning, however, that C++ and Qt will only be available within libdeng2 for new code; the old code in the main Doomsday executable will remain as it is. It would be possible to link C++ code into the Doomsday executable as well, but having tried this in the neworder branches, I believe it is not worth the trouble of updating all the headers and parts of the code to be C++ compatible.

The goal of the remove-sdlnet branch is to remove all dependencies on SDL_net by implementing the equivalent functionality with Qt. The code will naturally be in C++ and will reside in libdeng2. To facilitate access from the legacy C codebase, libdeng2 will export a C wrapper API through which a subset of its functionality can be accessed.
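To illustrate the C wrapper idea, here is a minimal sketch of the pattern: the C++ object is hidden behind an opaque handle, and plain functions with C linkage operate on it. All the names here (`Buffer`, `Deng2_Buffer_*`) are invented for the example; the actual libdeng2 classes and exported functions will differ.

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Stand-in for a libdeng2 C++ class (in the real code this might wrap a
// Qt networking object such as a socket or datagram buffer).
class Buffer {
public:
    void append(const std::string& data) { contents += data; }
    std::size_t size() const { return contents.size(); }
private:
    std::string contents;
};

// The C wrapper API: C callers only ever see an opaque handle and a set of
// plain functions, so they never need to compile any C++ headers.
extern "C" {
    typedef void* Deng2_Buffer;  // opaque to C code

    Deng2_Buffer Deng2_Buffer_New(void) {
        return new Buffer;
    }

    void Deng2_Buffer_Delete(Deng2_Buffer b) {
        delete static_cast<Buffer*>(b);
    }

    void Deng2_Buffer_Append(Deng2_Buffer b, const char* data) {
        static_cast<Buffer*>(b)->append(data);
    }

    std::size_t Deng2_Buffer_Size(Deng2_Buffer b) {
        return static_cast<Buffer*>(b)->size();
    }
}
```

From the C side, the legacy code would simply declare these functions in a plain C header and call them without ever knowing a C++ object is behind the handle.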

If you've been following the unstable releases feed, you may have noticed that since the qmake switch there have been a lot more warnings in the gcc-based builds. This is because the warning options in the previous build settings were much more lenient. However, as it seems unlikely that we'll spend hours fixing hundreds of virtually harmless warnings, I have decided to disable most of them. For any new code, though, I recommend applying strict compiler warning options.
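In qmake project files this kind of split can be expressed per target; the following is only a sketch, and the specific flags are illustrative rather than the ones actually used in the Doomsday build:

```qmake
# Legacy C code: relax the warnings so the build log stays readable.
*-g++ {
    QMAKE_CFLAGS_WARN_ON = -Wall -Wno-unused-parameter
}

# New C++ code (e.g. in libdeng2's own .pro file): opt back into
# strict warnings so new problems are caught as they are written.
*-g++ {
    QMAKE_CXXFLAGS_WARN_ON = -Wall -Wextra
}
```

Because each library has its own .pro file, the lenient settings for the old code and the strict settings for new code can coexist in the same build.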

This week I'm going to continue work on the remove-sdlnet branch. Once it's done, it will be merged back into master, and I can start improving the quality of the network traffic.

danij:

Last week got off to a slow start for me, as the switch of our build system to qmake presented a few logistical issues on my Windows development system. After revising our build scripts somewhat to replace the post-build "staging area" used on Windows with a "deploy install", and after writing a few simple scripts to generate Visual Studio solution and project files from our qmake-based build system, I'm now back to having a functional development environment on this platform.

On Thursday NiuHaka from the Doom Ascension project contacted me about an odd problem he had encountered with Doomsday's lighting when using high-resolution Patches: the automatically calculated light sources were being positioned incorrectly. It turns out that Doomsday has hitherto positioned these based on the dimensions of the original IWAD resource rather than those of the high-resolution replacement. This issue has been fixed in the master branch (and is thus included in today's build #276).
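The essence of the fix can be sketched as a coordinate-space conversion. This is purely illustrative code, not Doomsday's actual implementation: a bright-spot position analyzed in the replacement texture's pixel space has to be scaled back into the logical space defined by the original IWAD patch dimensions.

```cpp
#include <cassert>

struct Point2f { float x, y; };

// Convert a position found in the hi-res replacement's pixel coordinates
// into the logical coordinate space of the original IWAD patch. Names and
// signature are invented for this example.
Point2f logicalLightOrigin(Point2f posInReplacement,
                           int origWidth, int origHeight,  // original IWAD patch
                           int replWidth, int replHeight)  // hi-res replacement
{
    // Without this scaling, a 4x-resolution replacement would place the
    // light source four times too far from the patch's top-left corner.
    return Point2f{ posInReplacement.x * float(origWidth)  / float(replWidth),
                    posInReplacement.y * float(origHeight) / float(replHeight) };
}
```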

As I mentioned in my development update a few weeks ago, I was really in need of a break from my debugging work in the ringzero branch. I decided to take a look at an issue with the simple map object shadows in Doomsday's world renderer. This ultimately resulted in my implementing an alternative algorithm for these, based on the one used for the projection of dynamic lights in the ringzero branch. The upshot is that shadows drawn with this algorithm do not suffer from the same issues as the simple approach (i.e., shadow polygons overhanging ledges, floating in mid-air in self-referencing sector constructs, etc.). However, it can result in a fair bit of overdraw. Neither is ideal, so my current plan is to implement a heuristic analysis of a map object's shadow-casting properties to determine which algorithm to use in a given situation.
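The shape such a heuristic might take can be sketched as follows. This is entirely hypothetical (the real analysis is still to be designed); the inputs and thresholds here are made up purely to show the idea of choosing the cheap algorithm only where it cannot visibly misbehave.

```cpp
#include <cassert>

enum class ShadowAlgorithm { Simple, Projected };

// Hypothetical selector: fall back to the projection-based algorithm (more
// overdraw, but correct on ledges and in self-referencing sector constructs)
// whenever the simple approach could produce visible artifacts.
ShadowAlgorithm chooseShadowAlgorithm(float shadowRadius,
                                      float distToNearestEdge,
                                      bool inSelfReferencingSector)
{
    // Self-referencing constructs always confuse the simple approach.
    if (inSelfReferencingSector)
        return ShadowAlgorithm::Projected;

    // If the shadow polygon could overhang a nearby ledge, project it.
    if (distToNearestEdge < shadowRadius)
        return ShadowAlgorithm::Projected;

    // Otherwise the cheap algorithm is safe.
    return ShadowAlgorithm::Simple;
}
```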

In the process of working on the shadows, I noticed that we were recalculating map surface tangent space vectors every frame in numerous parts of the renderer (e.g., dynamic light projection, light decoration plotting, etc.). As these vectors rarely (if ever) change, it was clear that caching their values in the internal map data representation was the way to go. I decided to implement this optimization in the ringzero branch, as there have been too many other changes to the renderer since the point at which master forked.
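The caching pattern is simple enough to sketch: compute the tangent space vectors lazily on first request, keep them until the geometry actually changes, and let every part of the renderer share the cached result. The structure and names below are invented for the example, not taken from Doomsday's sources.

```cpp
#include <cassert>
#include <cmath>

struct Vector3 { float x, y, z; };

// Illustrative wall surface with a cached tangent space. The vectors are
// computed once on demand and reused by all renderer subsystems thereafter.
struct WallSurface {
    // Wall endpoints on the 2D map plane.
    float x1 = 0, y1 = 0, x2 = 0, y2 = 0;

    mutable bool tangentSpaceValid = false;
    mutable Vector3 tangent, bitangent, normal;
    mutable int updateCount = 0;  // for demonstration only

    const Vector3& surfaceNormal() const {
        if (!tangentSpaceValid) updateTangentSpace();
        return normal;
    }

    void updateTangentSpace() const {
        float dx = x2 - x1, dy = y2 - y1;
        float len = std::sqrt(dx * dx + dy * dy);
        tangent   = Vector3{ dx / len, dy / len, 0.f };
        bitangent = Vector3{ 0.f, 0.f, 1.f };             // walls are vertical
        normal    = Vector3{ dy / len, -dx / len, 0.f };  // perpendicular in 2D
        tangentSpaceValid = true;
        ++updateCount;
    }

    // Called only when the wall's geometry actually changes (rare).
    void markDirty() { tangentSpaceValid = false; }
};
```

The key point is that repeated queries in a frame (light projection, decoration plotting, and so on) now cost a flag check instead of a recomputation.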

Over the past couple of days I have been looking into replacing Inno Setup, which we currently use to generate our Windows installer, with the WiX Toolset. This work is looking rather promising thus far and will hopefully enable us to address some of the known shortcomings of our present installer (e.g., orphaned files and unintelligent upgrades).
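For readers unfamiliar with WiX: installers are described declaratively in XML. The fragment below is only an illustrative sketch, not the actual Doomsday installer sources; the version number is a placeholder and the GUID is deliberately left unfilled. The `MajorUpgrade` element is what addresses unintelligent upgrades: it uninstalls the previous version as part of installing the new one, so no orphaned files are left behind.

```xml
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <Product Id="*" Name="Doomsday Engine" Language="1033" Version="1.0.0"
           Manufacturer="Deng Team" UpgradeCode="PUT-UPGRADE-GUID-HERE">
    <Package InstallerVersion="300" Compressed="yes"/>
    <MediaTemplate/>

    <!-- Cleanly removes any older installed version before installing. -->
    <MajorUpgrade DowngradeErrorMessage="A newer version is already installed."/>

    <Feature Id="Complete" Level="1">
      <!-- ComponentGroupRef elements for the deployed files would go here. -->
    </Feature>
  </Product>
</Wix>
```

Because every file is tracked as a component with a stable GUID, Windows Installer itself can reference-count and remove files on upgrade, which is exactly the bookkeeping our current installer lacks.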

This week my plan is to continue researching the possibility of using WiX and then, if it proves viable, to implement and integrate it into our build system. I'll return to the work on ringzero sooner rather than later, however.

Comments

  • This may be off topic, but I just entered map 06 in Plutonia II for the first time and noticed that the map has a bad framerate, at least at the beginning. I found that turning off dynamic lights makes it much smoother, although not 100% smooth. Why do dynamic lights slow down certain maps? I know the Doomsday base code is old, and people on this forum have already mentioned to me that performance suffers because of this, but what specifically causes this dynamic light performance problem? It probably has to do with the number of lights, but why does the number of lights hurt performance so much? Will this particular problem be fixed in 1.9.0 final? Optimization is really a major priority, in my opinion, since the framerate was degraded to a pretty much unacceptable degree, and I don't like turning off dynamic lights if I don't have to.
  • Optimizing known bottlenecks in the renderer is one of the important things we're planning to do. At least for me it's been too long since I've fired up a profiler and really gotten a good look at what's slowing things down. So I agree that before the next stable release we need to do some optimizing to avoid the worst slowdowns.