Author: James Courtier-Dutton
Date:
To: Hampshire LUG Discussion List
Subject: Re: [Hampshire] help with algorithm
On 24 July 2010 10:46, Chris Dennis <cgdennis@???> wrote:
> On 24/07/10 09:21, James Courtier-Dutton wrote:
>>
>> Can anyone come up with a better algorithm?
>
> No. What would the algorithm do in this simple example?:
>
> read x,y;
> data[x] := y;
> x would be unbounded, so no conclusion could be drawn.
> There's no general way of knowing what the bounds of the array should be.
> (Unless there is bounds-checking code, which there may not be.)
>
> Could your decompiler analyse the data area of the program, and work out,
> for example, that the data[] array starts at address A, and no other
> variables point to memory between A and B. The size of the array is then
> B-A.
>
> Or have I missed the point?
>
The "B-A" approach is another algorithm that could be used. You could
combine a number of such "tests" and then use the most likely result.
What I am trying to achieve is this.
If a human looking at the code could conclude that the array is of
fixed size, I want my code to do it automatically.
If a human looking at the code could not conclude the array size, I am
happy for my code to fail to conclude it as well.
The aim of my decompiler is to, as much as possible, automate the
tasks that a human would have to go through to decompile a binary.
My aim is this:
1) Decompile the binary into a .c file.
2) Compile the generated .c file; the resulting binary should
function like the original binary file.
Another point is that I have decompiled hundreds of binary programs
manually, and I wanted to develop a tool to speed that process up.
Take your example:

> read x,y;
> data[x] := y;

You can immediately tell that you cannot determine the array size of data[].
Moving from that intuition to the conclusion "because x is unbounded"
is a difficult step to automate.
Well, obviously it is an easy step for this example, but if the code
were much more complex, it would be far harder to work out.