Keep It Simple, Stupid
Re: Re: American Flag? by trantor (Chaplain)
on Oct 02, 2001 at 11:07 UTC
It is indeed possible, if you consider acceptable a flag that is 117 ASCII lines tall, with stars 7 lines "tall" and stripes 9 lines "tall".
Here's how. First, let's roughly split the flag into three areas: A, the star field; B, the stripes to its right; C, the full-width stripes below.
Considering the height only: area A is made of 9 rows of stars, area B of 7 stripes, and area C of 6 stripes.
Let x be the height of each star row, y be the height of each stripe, h the total height of the flag.
Then we have:
9x = 7y
13y = h
The first equation says that 9 rows of stars must be as tall as 7 stripes; the second, that the flag is 13 stripes tall.
With some simple algebraic operations we have:
x = 7h / 117
y = 9h / 117
Now, if we let h = 117, the minimum value for which both x and y are integers, each row of stars must be 7 lines tall and each stripe 9 lines tall.
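The derivation above can be checked mechanically. A minimal sketch (the function name is my own, not from the post) that solves for x and y and confirms h = 117 is the smallest workable height:

```python
# Solve 9x = 7y, 13y = h for integer x, y:
# x = 7h/117 and y = 9h/117, integers only when h is a multiple of 117.

def star_and_stripe_heights(h):
    """Return (x, y) = (star-row height, stripe height) for a flag of
    height h, or None if they are not both integers."""
    if (7 * h) % 117 or (9 * h) % 117:
        return None
    return (7 * h) // 117, (9 * h) // 117

# No height below 117 works:
assert all(star_and_stripe_heights(h) is None for h in range(1, 117))

print(star_and_stripe_heights(117))  # -> (7, 9)
```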
Doing this, areas A and B will both be 63 lines, area C will be 54 lines.
It is slightly imperfect, because the real flag has some space before the first row of stars and after the last, but it can work if each star is drawn with a suitable pattern (dots instead of spaces, and background and foreground swapped, for clarity).
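For illustration, here is a hypothetical 7-row star pattern (not the original poster's, which is not reproduced here), with each "pixel" doubled horizontally as described below so it comes out roughly square in a fixed-width font:

```python
# Hypothetical 7x7-pixel star: '.' is background, '#' is the star.
# Each pixel is printed as two characters to look roughly square.
STAR = [
    "...#...",
    "...#...",
    "#######",
    ".#####.",
    "..###..",
    ".#.#.#.",
    "#..#..#",
]

for row in STAR:
    print("".join(ch * 2 for ch in row))
```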
With a flag this big, one could even consider ASCII antialiasing :-)
Determining the width in characters, given that the average font cell is not square, is left as an exercise. When drawing the star I treated each "pixel" as a pair of ASCII characters, which gives a roughly square cell in many fixed-width fonts.
Area A could be 77 characters wide (7 characters per star * 11 columns), out of a total width of 192 characters. This still gives a viewable image on a 1024x768 screen using a font with a 5x6 grid.
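The screen-fit claim is easy to verify; a quick sketch, assuming the 5x6-pixel font cell mentioned above:

```python
# 192 columns x 117 lines, each character cell 5x6 pixels:
cols, rows = 192, 117
cell_w, cell_h = 5, 6

width_px, height_px = cols * cell_w, rows * cell_h
print(width_px, height_px)  # 960 702

# Both dimensions fit on a 1024x768 screen.
assert width_px <= 1024 and height_px <= 768
```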
In conclusion, it is a rather interesting problem with lots of room for obfuscation: think of the symmetry of a single star, for example, or the fact that the flag can easily be represented as an encoded bitmap.