Archive for July, 2008

Long-lasting drag/drops in ark, and xdnd timeouts

July 17, 2008

I’ve been working on getting Ark’s drag and drop to work again, and one of the goals I set was to make it more stable than before. The most important step was to trash the old method, which was to extract the file as soon as the drag started and then let the user place it. This failed badly whenever you tried to drag a large file out of Ark, and annoyed many a user.

So I thought, this time I’ll find a way to extract the files only when the drag has actually completed successfully. Following a tip from David Faure on IRC, I happily started subclassing QMimeData to make my own extract-files custom mime data class. It works by extracting the file when retrieveData() is called, while showing a progress dialog at the same time. Now, leaving some weird interactions with Dolphin aside, what surprised me was the buggy behavior I saw once an extraction took more than 5 seconds. I eventually found this limit to be exactly five seconds, and grepping around in Qt’s kernel classes, I found this in the X11 dnd implementation:

// timer used to discard old XdndDrop transactions
static int transaction_expiry_timer = -1;
enum { XdndDropTransactionTimeout = 5000 }; // 5 seconds
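The deferred-extraction idea I described above can be illustrated without any Qt at all: the mime data object does no work up front, and the expensive step runs only the first time the drop target asks for the data. This is only a stand-in sketch (class and path names are made up); in the real implementation this logic lives in a QMimeData subclass’s retrieveData() override.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Stand-in for a lazily-extracting mime data object: extraction is
// deferred until the data is actually requested, mimicking what a
// QMimeData::retrieveData() override would do on drop.
class DeferredExtractData {
public:
    explicit DeferredExtractData(std::vector<std::string> entries)
        : m_entries(std::move(entries)) {}

    bool extracted() const { return m_extracted; }

    // Analogous to retrieveData("text/uri-list"): the (potentially slow)
    // extraction happens here, on demand, not when the drag starts.
    std::vector<std::string> uriList() {
        if (!m_extracted) {
            for (const std::string &e : m_entries)
                m_uris.push_back("file:///tmp/ark-extract/" + e);
            m_extracted = true;
        }
        return m_uris;
    }

private:
    std::vector<std::string> m_entries;
    std::vector<std::string> m_uris;
    bool m_extracted = false;
};
```

The five-second XdndDropTransactionTimeout above is exactly what bites this design: the receiving side only sees the uri-list once uriList() finishes, and that can easily take longer than five seconds for big archives.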

I didn’t dig much more than this, because some Google searches also seemed to confirm that drag and drop implementations are mostly subject to timeouts. So what do I do now? Is this a bug in the dnd implementation, or am I simply trying to use drag/drop the wrong way? Is there another way I can do this? To simplify, here are the constraints I am trying so hard to get around:

  • The solution must use drag and drop
  • The destination application will only receive a “text/uri-list” mimedata object with the locations of temporarily extracted files (i.e. no custom mimetypes that the receiving app would need to implement)
  • The extraction should only take place once the drag has completed successfully
  • The extraction should be allowed to take a long time (people extract really large archives these days), without disrupting either the source or destination application.

So is there a way out of this?

Oh, and if you’d like to try my test application for these long delays, it can be found here.

EDIT: The blog linked me automatically to an entry about XDS: http://en.wikipedia.org/wiki/Direct_Save_Protocol
Might this be what I want?

Ark: service menu, password protection files and more

July 12, 2008

Time for another update on my Ark changes!
Let’s keep it simple: I’ll put some screenshots here and explain along the way:

The batch extract/extraction progress

This is probably the feature I’ve been working on the most. After modifying the libarchive plugin and calculating in advance how much is to be extracted, the progress dialog now shows nicely and accurately how far the total extraction has come. It will also extract many files in a row when they are given on the command line. Now, some users might wonder (some might even complain) why I’m making a command line tool with GUI feedback at all. Other users will understand immediately just why this comes in handy, which brings us to the next point:

The dolphin/konqueror service menu


With the batch extraction in place and ark taking lots of nice parameters, implementing the service menu for dolphin/konqueror is a breeze. I haven’t completed it fully yet, but above is a quick preview of what’s working so far. Next up here will be a small submenu, showing additional entries such as “Extract to…”, “Extract all to subfolders”, etc. (I haven’t really decided which use cases deserve a menu entry here; feel free to discuss this in the comments section).
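For the curious, a service menu of this kind is just a .desktop file dropped into the ServiceMenus directory. The sketch below is illustrative only: the action name and the Exec flags are placeholders, not Ark’s final command line interface.

```ini
# Hypothetical Dolphin/Konqueror service menu entry for Ark (KDE 4 style).
[Desktop Entry]
Type=Service
ServiceTypes=KonqPopupMenu/Plugin
MimeType=application/zip;application/x-compressed-tar;application/x-rar;
Actions=arkExtractHere;

[Desktop Action arkExtractHere]
Name=Extract Archive Here
Icon=ark
# Placeholder flag; the real batch interface may use different options.
Exec=ark --batch %U
```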

The (once more) redesigned extract dialog


So after playing with several solutions for how to inform the user of the auto-subfolder mechanism, I eventually decided that the best way to make it feel natural, while not doing too much uncontrollable magic in the background, is to first check whether the archive is a single-folder archive, then use this information to enable/disable the checkbox, and inform the user why this choice has been made.
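The single-folder check itself is simple to state: an archive is a single-folder archive if every entry lives under one common top-level directory, so extracting it would not scatter files into the destination. A small illustrative helper (not Ark’s actual code) could look like this:

```cpp
#include <string>
#include <vector>

// Returns true if every entry path shares the same top-level folder
// component, i.e. extracting the archive creates exactly one directory.
bool isSingleFolderArchive(const std::vector<std::string> &entries)
{
    if (entries.empty())
        return false;
    std::string topLevel;
    for (const std::string &entry : entries) {
        std::string::size_type slash = entry.find('/');
        if (slash == std::string::npos)
            return false;                 // a file sits at the archive root
        std::string prefix = entry.substr(0, slash);
        if (topLevel.empty())
            topLevel = prefix;
        else if (prefix != topLevel)
            return false;                 // a second top-level folder exists
    }
    return true;
}
```

With this answer in hand, the dialog can disable the “extract into subfolder” checkbox for single-folder archives and tell the user why.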

Password protected archives

Finally, there are the password protected rar files that were mentioned by a user in the comments of a previous blog entry. At first I was surprised to see that there was simply no support for passwords at all in the new archive framework, but after coding a little bit I was once more impressed by how solid the ark codebase is, and how easy it was to add this extra property to the display. So far rar and zip files are checked for password protection, and the display reflects this. Next up is querying the user for the actual password, and finally getting that password all the way to the extraction code.
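For zip files, by the way, detecting password protection doesn’t even require decrypting anything: the ZIP format marks encrypted entries via bit 0 of the 2-byte general-purpose bit flag in each file header. A tiny helper makes the check explicit:

```cpp
#include <cstdint>

// Bit 0 of the ZIP general-purpose bit flag (see PKWARE's APPNOTE)
// signals that the entry's data is encrypted, i.e. password protected.
bool zipEntryIsEncrypted(uint16_t generalPurposeBitFlag)
{
    return (generalPurposeBitFlag & 0x0001) != 0;
}
```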

Another thing that surprised me was that libzip, which is what ark uses for zip files, doesn’t yet support reading or writing password protected archives. A way around this would be to either 1. switch completely to the command line zip utility, the way rar is handled now, 2. switch to the cli zip utility only for password protected files, or 3. pull the relevant extraction functionality out of the cli zip utility and include it as another zip plugin for ark. What do you think?

Ark: some help on software design please…?

July 1, 2008

The last few days I’ve been a little stuck on the Ark coding due to design decisions, but I eventually figured a blog post to the community would get the right answers right away. (What I’m currently focusing on is building a batch extract interface for ark, accessible through command line arguments, that only shows the user the progress in a simple progress dialog and exits when finished.)

Ark is divided into several parts:

  • part – the kpart along with the interface
  • app – the (relatively) trivial application that houses the kpart
  • kerfuffle – the extraction library base
  • plugins – the library plugins

The problems showed up when I tried to let the main app take some arguments so that, instead of opening the file in a visible part, it opens the part hidden and extracts the files one by one through it. This works fine with a single file, but once I want to do it sequentially I need some kind of a signal to know when each extraction has finished. I went back and forth between several seemingly possible solutions, and eventually questioned how proper it actually is to use the kpart in this fashion (e.g. using it almost like a library, without showing the widget to the user). My motivation was to reuse the archive handling code already in ark’s kpart, but if this reuse introduces lots of ugly workarounds to actually make it work, I’m not so sure anymore.

So here are the solutions I currently see as possible; I’d like some feedback on which one is the “KDE way”:

1. Ignore the need for a “finished” signal, and instead poll an isBusy() function with a QTimer. Extraction is still done through the KPart.
My initial approach, aiming to shake up as little as possible of the existing ark codebase.

2. Start with a new “batch” subfolder: a command line application that does not use the KPart, but instead uses the kerfuffle extraction library and its plugins directly.
Would need a lot of reorganising, but could also be a way to further single out the actual extraction code from the kpart into externally available classes. Could end up with some code duplication and/or reimplementation, for example with regards to error handling during extraction. Could also, to a light degree, split coding efforts between the batch interface and the main application.

3. Abstract the extraction into yet another class, and recode the KPart to use this instead (maybe even treating a single extraction as a special case of a batch extract).
The batch operation would then be handled in the same application as the one housing the kpart. This would probably also be a source of new bugs, but fixing them would stabilize extraction/error handling in both the kpart and the batch extract interface. What kind of stops me with this one is that I’m unsure whether I’m over-abstracting the already abstract kerfuffle interface, or whether it can be seen as adding another layer that simplifies the low-level kerfuffle library.
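To make option 3 concrete, here is a rough stand-in sketch (all names hypothetical, no Qt): a thin driver class that runs extractions sequentially, where a single extraction is just a batch of one. In real code the onFinished callback would be a Qt signal/slot connection rather than a std::function.

```cpp
#include <functional>
#include <queue>
#include <string>

// Hypothetical driver for option 3: both the KPart and the command line
// batch interface would funnel their work through one class like this.
class BatchExtract {
public:
    void enqueue(const std::string &archive) { m_queue.push(archive); }

    // Runs the whole queue in order, reporting each finished archive.
    // A single extraction is simply a queue of length one.
    void run(const std::function<void(const std::string &)> &onFinished) {
        while (!m_queue.empty()) {
            std::string archive = m_queue.front();
            m_queue.pop();
            // ... extract 'archive' via the kerfuffle library here ...
            onFinished(archive);
        }
    }

private:
    std::queue<std::string> m_queue;
};
```

The “finished” notification that option 1 tries to fake with a QTimer falls out naturally here, since the driver itself knows when each job ends.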

4. Is there a better way out of this chaos?
Maybe just give up? Do people really want the batch extract interface?

To summarize: am I trying to use KParts in a way that is seen as incorrect? How much abstraction/ripping the code apart is just right? I kind of keep scaring myself with thoughts of future problems, with bad decisions leading to tons of bugs and responsibilities I can’t handle…

EDIT: The kerfuffle library was actually much easier to handle than I thought before writing this post. I’ve reorganized a few files now and found a quite elegant solution (a variant of #2), avoiding code duplication by moving some parts of the KPart into the kerfuffle library. Thanks for the comments received.