Read Latex

Monday, September 13, 2010

Design Patterns vs. Components in Software Reuse

I have been pondering the dichotomy of Design Patterns vs. Reusable Software Components (RSCs). Are they the same thing, or are they opposites? RSCs are the integrated circuits of software, while design patterns are reusable chunks of observable interactions that appear repeatedly. A design pattern is not an algorithm for performing some calculation. It is a higher-level aggregation, often with less specificity. Thus design patterns are not themselves implementations of solutions to computational problems or interaction scenarios. Similarity vs. uniqueness is a controlling idea here.

A key idea in clarifying this is to consider the difference between functional blocks of capability and the interconnections between those blocks. Another useful idea is that of process. Functional blocks, at any level of aggregation, can be reduced to a single idea: inputs to the block produce outputs from the block. Ensembles of functional blocks can always be replaced by a single equivalent functional block that abstracts away the internal complexity of the contents. Functional blocks make mathematical and algorithmic descriptions look like circuits. In fact, a software functional block can always be replaced by an equivalent piece of hardware and vice versa. This fact has not yet been fully exploited in the current software revolution; otherwise you could buy Microsoft Word on a chip.
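
To make the replacement rule concrete, here is a minimal sketch in Python (the block names are hypothetical, invented for illustration): any ensemble of functional blocks wired in sequence can be collapsed into a single equivalent block with the same inputs and outputs.

# A functional block is anything that maps inputs to outputs.
def amplify(x):
    # Hypothetical block: scale the signal.
    return 3.0 * x

def clip(x):
    # Hypothetical block: limit the output to [-1, 1].
    return max(-1.0, min(1.0, x))

def compose(*blocks):
    # Replace an ensemble of blocks with one equivalent block.
    def equivalent(x):
        for block in blocks:
            x = block(x)
        return x
    return equivalent

# The two-block ensemble and its single-block abstraction agree.
front_end = compose(amplify, clip)
assert front_end(0.5) == clip(amplify(0.5))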


Imagine taking the 23 design patterns, overlaying them on each other in all 253 possible pairings, and asking: what are the similarities and differences between them and their subsets?
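
That 253, for the record, is just the number of unordered pairs that can be drawn from the 23 Gang of Four patterns:

\binom{23}{2} = \frac{23 \cdot 22}{2} = 253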


These differences are largely the interconnects that glue together portions of the patterns, as opposed to the UML blocks themselves. UML is a useless modeling language and violates my own notions of how graphical depictions of things should work, but I digress.


Resolution of the aforementioned dichotomy takes place when we look at any printed circuit board (PCB) of reasonable complexity.
Every PCB needs power, input/output channels, and cooling, but these are all just functional relationships which can be, and are, represented as functional blocks with signals flowing between them. Some are signals we care about; some are just housekeeping, like power.


In a PCB, islands of reusable components (ICs) are connected in higher-level idioms to accomplish some design objective. If an aggregation of chips (blocks) occurs frequently enough, the ensemble is turned into a single chip, just as the abstraction of many functional blocks can be replaced by a single block.


For a given PCB, the overall pattern is unique to the purpose of the PCB, yet it is composed of functional blocks which take on familiar and reusable patterns. These patterns repeat themselves and are therefore factorable into recognizable idioms, much like the letters, words, sentences and paragraphs of what I am typing just now.


The resolution of the similarity versus uniqueness question is given by the latter analogy. This note I am writing is a unique combination of reusable letters, words, and sometimes sentences. But as the specific purpose of this note is articulated in increasing detail, it becomes increasingly unique. DNA is the same way. Four reusable symbols - C, T, A, G - code for 64 possible triplets (CCC, CCT, CCA, etc.) that specify 20 amino acids, which polymerize (yes, people ARE plastic) to form proteins, which fold into useful configurations.


At increasing levels of complexity there seems to be less and less likelihood of reusability. But in biological systems this is not necessarily true. For example, a complex organ like the heart can be transplanted into another individual and work well enough to sustain life.


Letters and words are reusable by necessity, lest we bear the burden of continually reinventing writing, communicating at the letter, word, and sentence level, and never getting anything useful done. It would be interesting to devise a language in which each idea is represented as a succession of new letters. But I will leave that glorious exercise to another.




Something like a word processing program lives at the level of a PCB, while a font chooser lives at the level of a design pattern. But we need more than font choosers to make the world go 'round!


Design patterns are reusable, but at a meta-level, since they are not concrete. When we design a GUI, certain patterns have emerged. The menus are at the top; output and status information appear at the bottom. All languages read from top to bottom, so this is quite natural. Is this because we look at the face before the feet of someone who greets us?


Consider the wonderful design documentary, "Objectified", which features Machines That Make Machines (MTMM). People, by the way, do this, as do all living things. A beautiful design is one that can be transparent in all its aspects and still be clearly discerned. The sheet rock in most homes hides a network of two-by-fours, conduits, pipes, and debris behind a facade of paint and crumbling gypsum. If a design is transparent, then the machines that make machines should be just as beautiful and functional as the things they are used to make. Note that machines that make machines don't do so without first being created by their creator. In the "end game" only one level of machines that make machines is necessary because, like the functional blocks abstracted above, the expression ((MTMM)TMM)TMM can be rewritten as MTMM.


Consider the assembly line staffed by pick-and-place robots. These are MTMMs. The CNC machines that work in concert to produce pistons, connecting rods, and wheels are also MTMMs. They make technical artifacts like cars, washing machines, PCBs, etc.


Sections of assembly lines which bear a similarity to each other across technical artifacts are analogous to the design patterns of software identified by the Gang of Four. The parts they create are the reusable (think green) components, which can be recycled without further disassembly.


A principle emerges: at low levels of complexity everything looks reusable. A molecule, a resistor, an op-amp, a nut, or a bolt. As complexity escalates, uniqueness increases. Thus reusability of the aggregation becomes less and less, but not always, as in the case of DNA, where long strands are evolutionarily conserved because that was the only way the problem could be solved.


Two categories of tasks emerge: those that are reducible to repetition and those that are not.




Monday, August 16, 2010

Location-Based Radio



WHAT

I want you to imagine a new kind of radio - a radio where the station is selected by its geographic location rather than its radio frequency:

Traditional Radio Tuning
Instead of a radio with a tuning knob that slides across the band, imagine a radio that uses a global view of the earth and stars:

Location-Based Radio Console
Each station on such a radio would be a small icon, rendered on or above the Earth. The icon's properties might indicate the station type, frequency, and bandwidth.



A location-based radio is "tuned" by selecting the location of the station of interest. In the figure below, a terrestrial radio station labeled X is at one location in the spectrum. A second signal Y is a celestial source from a different spectral slot.



Another kind of tuning is possible - a radio station can be located by first clicking on its position in the spectrum and then observing its physical location.

HOW

There are several ways we might achieve a location-based radio. Naively, we might index a list of radio stations by their latitude, longitude, and altitude. We might then create a program that would look up the location we selected and open that data stream, as, for example, iTunes™ does:

iTunes Radio Listing
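
A minimal sketch of that naive lookup (Python; the stations, coordinates, and stream URLs are all made up for illustration):

# Index stations by (latitude, longitude, altitude in meters).
stations = {
    (34.75, -92.27, 100.0): "http://example.com/streams/station-a",  # hypothetical
    (40.71, -74.01, 250.0): "http://example.com/streams/station-b",  # hypothetical
}

def tune_by_location(lat, lon, alt):
    # Return the stream URL of the station nearest the selected point.
    def dist2(key):
        klat, klon, kalt = key
        return (klat - lat)**2 + (klon - lon)**2 + ((kalt - alt) / 1000.0)**2
    return stations[min(stations, key=dist2)]

print(tune_by_location(34.7, -92.3, 0.0))  # nearest station's stream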


Such a user interface, though convenient and even useful, misses a major conceptual opportunity. I propose a more fundamental and far-reaching implementation of location-based radio, one that is considerably more versatile, entertaining, and enabling of discovery.

MULTILATERATION

The idea for a location-based radio was spawned by a very simple desire. I wanted a radio that would tell me whether a signal was of terrestrial or celestial origin. It takes several radios listening simultaneously to answer this question. If a radio signal is common to four or more receivers, it can be uniquely located in three-dimensional space using the Time Difference Of Arrival (TDOA) of the signals.

Image Courtesy Agilent Technologies
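
For the curious, here is a sketch of the TDOA idea in Python. Each receiver that hears the signal supplies one arrival-time equation; with four receivers we can solve for the three position coordinates plus the unknown emission time. The receiver coordinates are invented for illustration, and scipy's general least-squares solver stands in for a real multilateration algorithm:

import numpy as np
from scipy.optimize import least_squares

C = 299792458.0  # speed of light, m/s

# Four hypothetical receiver positions (meters, local frame).
rx = np.array([[0, 0, 0], [10000, 0, 0], [0, 10000, 0], [0, 0, 10000]], float)
source = np.array([3000.0, 4000.0, 5000.0])  # unknown, to be recovered
t0 = 0.001                                   # unknown emission time, s

# Simulated arrival times at each receiver.
arrivals = t0 + np.linalg.norm(rx - source, axis=1) / C

def residual(p):
    # p = (x, y, z, r0) where r0 = c * t0, keeping all unknowns in meters.
    pos, r0 = p[:3], p[3]
    return r0 + np.linalg.norm(rx - pos, axis=1) - C * arrivals

fit = least_squares(residual, x0=[1000.0, 1000.0, 1000.0, 0.0])
print(fit.x[:3])  # recovers approximately (3000, 4000, 5000)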


However, this required four radios instead of just one. That was bad. Perhaps if three other people could share the cost of one radio each, all could benefit. That was good. Running this idea to its logical conclusion results in the creation of a location-based radio network. We know that networks are powerful and enabling things.

In a location-based radio network, four is the minimum number of receivers, but in general more radios are better. More signals can be heard, more common signals can be found and more people can benefit. The effect compounds quickly.

For example, if a signal is of terrestrial origin, one might listen to it, or in the case of amateur radio, respond to it with a transmission. If the signal is of celestial origin, one might want to analyze it, discovering its location and structure. The former is entertainment, the latter is science. Science can be entertaining, and entertainment scientific.

SOFTWARE-DEFINED RADIO

Preceding my desire for a location-based radio was the advent of software-defined radio. Software-defined radios eliminate some of the traditional hardware of a conventional radio. A portion of the physical radio is replaced by software in the form of digital signal processing algorithms. A “soft radio” is a hybrid device, spanning analog and digital worlds, hardware and software. This introduces complexity but produces a payoff - radio signals can be digitized and routed like other web-based media as TCP/IP packets. Such packets can be exchanged by constellations of location-based radios over the internet for comparison and analysis.

If these packets are time-stamped very accurately, they can be used to locate the signal. Time stamps do not occupy much space, so transmitting them from soft radio to soft radio does not create a lot of overhead. In fact, the original RF can be stripped away and just the audio portion of the signal retained - as long as the time stamps are accurately registered against the signal samples. This reduces transmission time and overhead significantly.
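
A sketch of what such a packet might carry (the field names are my own invention, not an existing protocol): the RF carrier is gone, and only the demodulated samples plus a GPS-disciplined stamp for the first sample remain.

import json

# Hypothetical audio packet for a location-based radio network.
packet = {
    "node_id": "orb-042",                # made-up node name
    "gps_time_ns": 1284400000123456789,  # GPS-disciplined stamp of sample 0
    "sample_rate_hz": 48000,
    "samples": [0.01, 0.02, -0.01],      # demodulated audio, RF stripped away
}

# One 8-byte timestamp per packet: tiny overhead compared with
# shipping the raw digitized RF band.
print(len(json.dumps(packet)), "bytes as JSON")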

Digital signal processing in the “cloud” makes location-based radio possible. Software-defined radio enables large swaths of spectrum to be simultaneously processed and shared, so hundreds of signals can be observed and analyzed, instead of just one at a time as in conventional radio. Location-based radio is to conventional radio as Google is to a filing cabinet.

BACKGROUND

During the Second World War, there was a race to develop and use radio techniques to locate aircraft and naval vessels. The British innovated with a technique called multilateration. However, it is the advent of GPS that makes location-based radio possible at reasonable cost.

CONSTRAINTS

One constraint on any radio is the band that it listens to. Software-defined radio now routinely spans from AM frequencies of 530 kHz on the low end to 1.3 GHz and beyond on the high end. The radio spectrum from 1 MHz to 1 GHz spans three decades of frequency. Such bandwidth is within reach of soft radios currently on the market.

WHAT'S OUT THERE

Several companies already manufacture software-defined radios that can be used as a starting point for a location-based radio network. A standard software-defined radio (SDR) can be equipped with a GPS, so that its internal oscillator is “GPS disciplined”. This discipline guarantees that digitized chunks of spectrum can be accurately time-stamped to within a few nanoseconds. Light travels about a foot per nanosecond, so if a signal common to four radios is time-stamped accurate to the nanosecond, the location could theoretically be good to a foot. At this writing, fifty nanoseconds is an achievable goal and more than enough to provide utility. With larger networks of radios, techniques such as differential geodesy can be used to refine the location of the signal by observing common signals for longer periods of time.
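
The back-of-envelope arithmetic, as a sketch:

C_FT_PER_NS = 0.9836  # light travels roughly one foot per nanosecond

for timing_error_ns in (1, 50):
    position_error_ft = timing_error_ns * C_FT_PER_NS
    print(timing_error_ns, "ns of timing error -> about",
          round(position_error_ft), "ft of position error")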

IMPLEMENTATION

With just a few location-based radio nodes, one could resolve light dimmers, microwave ovens, FM stations, satellites, etc. Knowing the location of a radio source is one of the most important ways of determining its nature. And knowing the locations of the multitude of radio signals impressing themselves upon us every moment of every day is not only entertaining, it is interesting science. It is also democratizing.

A simple design, produced and distributed, would enable a location-based radio capability to be grown from the grass-roots level. It could be based on existing SDR designs, of which there are several to choose from.

CONCLUSION

Imagine being able to locate every radio signal that was impinging upon your existence, by working in concert with a network of similar web-based soft radios. This has far-reaching ramifications some of which, like the internet, may be difficult to predict or anticipate. One example is the ability to track meteors, to in effect, "Catch a Falling Star".

ACKNOWLEDGMENTS

This work is dedicated to my best friend and technology partner, Marilyn Fulper, whose life on earth was cut short when a car ran a red light and struck her on her bicycle.



REFERENCES

Multilateration: Locating an Object by TDOA

L. Van Warren - A Blazing Fast Introduction to SDR

L. Van Warren - To Catch A Falling Star

L. Van Warren - A Short Trek to DNA-Cutter M87

Gerald Youngblood - Four part series on SDR


Wednesday, April 07, 2010

Fixing Java's Epic Fail

----
ACT I
EXT. NIGHT
A Gran Torino is parked in a vacant lot.
A slim figure with a gun towers over a strange penguin-like figure,
"working" over a certain car.




I will speak with words of one sound.
I will not raise my tone.
I will not take this one more day.

We had high hopes for you.
Day and night, night and day.
But you are a punk.
We took it and took it.
We thought it would stop.
We thought it would change but it did not.

Here is where we are.
You can change or you can die.
It is up to you.
So what will you do punk,
now that you are five and ten?


---

He was supposed to be objective, portable and secure.

"Leaks no more" they said. "Browser-ready".

We bought in and let you work with... with your GUIs, your APIs, your IDEs. You took our days, our nights, AND our weekends.



You were slow. Twenty times slow.

Here's the lesson punk:

TRUST CANNOT BE AUTOMATED
 

Sometimes you need a person to MAKE A CHOICE.
Baseball has an umpire. There is a reason for that.


Here's your choice.


FIX THIS or the strange little man gets it:

1) Enable file I/O in any context, with MANNERS

MANNERS means ASK PERMISSION:
"Strange little man wants to write file X from author Y on your computer, is that okay?"

2) Eliminate security obstructions for developers on their own machines.

Deployment is a separate step, with MANNERS

3) Enable compilation to native code on all platforms.

GNU fixed this; adopt it.

4) Enable pipes with MANNERS.

Got that Captcha?!

Sunday, March 28, 2010

Dark Flow and a Soft Radio Network


To the dark matter and dark energy mysteries, we can now add dark flow...

NASA’s Sasha Kashlinsky discovered a twenty-degree patch of sky between Centaurus and Vela to which 700 X-ray clusters are being pulled at 611 miles per second. The significance of this is that it contradicts predictions that large-scale motion should show no preferred direction and that the motions should decrease at ever increasing distances. Kashlinsky posits that the source of the pull is "outside the currently observable universe".

I mention this because this gives us a patch of sky to which we can point our software-defined radios and perhaps observe something interesting.


The limitation is that our radios have to be either space-borne or in the Southern Hemisphere to get in on the action. Also, this "dark flow" patch occurs out of the range of the Ukrainian radio telescope data visualized in a previous post:


It is my hope to create a network of Orbs - soft radios that can cooperate to locate celestial signals. Orbs are wide-band radios that downlink to the web using TCP/IP 802.11 protocols. Orbs talk to each other using ham, astronomy, and ISM bands in real time. Their locations in space-time are computed using GPS-disciplined internal oscillators.


Thursday, February 04, 2010

Game Theory: Socialism vs. Capitalism: The Hybrid Strategy


It always pays to articulate the obvious. Sometimes it pays to articulate the subtle. I am going to talk about Cooperation vs. Competition. About Socialism vs. Capitalism. About game theory.

You decide if my points have merit. If there is fallacy in my metaphor, please identify it. Please save me from even one more minute of erroneous thinking, because life is short. Let's begin.



I like teaching people to do new things. I like physical education because it keeps people healthy. My belief is, that if people are healthy, they will be happier. I like to make people happy. That is who I am. Who are you?

I was teaching someone to play badminton. I like badminton because it combines agility, alertness, quickness and stamina.

The birdie can absorb as much power as a strong person can generate, producing a pop when the birdie goes supersonic. But a person of small stature can also excel. The physics of the game, “levels the playing field”.

Thus this innocent and interesting game can be enjoyed by a diverse group of people. That is another thing about who I am. I advocate things that include diverse groups of people. What do you advocate?

I can discuss the physics of the game and the aerodynamics of the birdie, but that is irrelevant to my point so I won’t. What is critical about badminton is the way in which people interact and contribute to the achievement of the game.

One day I noticed that if a seasoned player faces a novice player, the novice quickly becomes discouraged. Unless they are able to participate in keeping the birdie going back and forth, the game isn’t fun. A self-important seasoned player might obtain some joy in dominating the newer player. But as the newer player becomes discouraged, the game stops, resulting in no benefit to either player.

Thus emerges our first principle:

“When players are unevenly matched, competition destroys the game.”

Now in terms of coaching, teaching or participating, there is another strategy one can take.

When players are unevenly matched, the responsibility of the stronger player is to return the birdie such that the weaker player is guaranteed the possibility of returning it.

The consequence of this anti-competitive strategy is that the stronger player is now challenged to produce an exacting sort of shot, within the envelope of the weaker player’s skill. The weaker player now has the obligation to at least try to return this buffet platter of a shot. They are obligated to return the favor to the stronger player.

When this strategy is employed, a very interesting thing happens. The stronger player begins to fatigue, because it takes more energy and more skill to deliver this idealized shot to the weaker player so that the game can continue.

If the game continues in the anti-competitive strategy, after a while it is the stronger player who withdraws, because the demands of the game become so high. But the weaker player improves rapidly as a result of multiple successful returns.

Thus emerges our second principle:

“Anti-competition stresses the stronger player
while improving the weaker player”.

The game continues, but only for the duration of the stronger player's ability to endure.

I would say that these principles of competition and anti-competition should be obvious to everyone, but they were only obvious to me after fifty years of life, so perhaps not.

For the game to continue a new strategy MUST emerge. A strategy that takes the needs of both players into account. I call that strategy, the Hybrid Strategy. If you don’t already have it on the tip of your tongue, I will explain how it works.

Two players, a weaker and a stronger player, start anti-competitively, enabling the game to be established and allowing the players to assess their position and skillset in the game. The weaker player becomes stronger, and the stronger player (strength being measurable by score…) eventually tires and calls for a strategy switch.

The players now engage the game in a competitive strategy. BUT, the players are now more evenly matched. The weaker player is now stronger; the stronger player is now tired. The game continues until the weaker player no longer wishes to participate, or the roles reverse as the weaker player becomes the stronger.

Now we have the third and most important principle:

"When both players consent to a strategy switch, the game continues."

The result? Improvement and value-added for both players. When either player does not consent to a strategy switch, the game ends.

Now in politics or government the metaphor can be applied as follows. Players can be Rich vs. Poor. Republican vs. Democrat. Brahman vs. Untouchable. High IQ vs. Low IQ. Strong vs. Weak. Coordinated vs. Clumsy. Citizen vs. Alien. Capitalist vs. Socialist. Etc.

The hybrid strategy enriches everyone’s life to the fullest extent, and leads to the most important principle, “Reduction of Harm”. Reduction of Harm is a topic for another essay, but is quite useful in calculating those laws, ordinances and enforcements that are, in some global sense, best for society.

The selflessness of the hybrid strategy ends up benefiting both parties to the maximum degree.

We know this intuitively. How can we put it into practice?

Wednesday, January 06, 2010

Duals and Duels


Point 1: Maslow's Hierarchy





Point 2: Economica:
There was a movie, directed by Dr. Charles Venus, about economics. It featured flow charts showing goods moving from supplier to factory to customer - with money flowing the opposite direction. Credit and Debit, Electrons and Vacancies if you believe in the Solid State.

At the time, the critical necessities were Food, Clothing and Shelter. Which is great for a cave man living in the Stone Age. For those not living in solitary confinement, an updated list is:

(Economic-Necessities
     Food Clothing Shelter Information
)



The last entry, information, implies communication, education, entertainment, PCs, TVs and cellphones.

Point 3: Duals and Duels:

ADC - DAC
The input to an analog-to-digital converter (ADC) is some measurable feature of the real world - the loudness of a moment in time, perhaps. The output, quantized to a digital representation, is a superposition of "Yes, I heard it" and "No, I didn't".

A digital-to-analog converter (DAC) does the opposite. It converts a digital number to an analog signal.

A good way to test an ADC is to undo it with its inverse, the DAC. One compares the real world to the real world run through the ADC-DAC combination. If the measurements are the same, you have a perfect ADC and DAC.

One can make a similar remark about the DAC-ADC combination.

The idea is this: one can test a thing using its dual. The dual of the ADC is the DAC and vice versa.
I find this simple idea to be one of the most useful ones I know.
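
Here is a minimal sketch of the dual test in Python: a toy uniform quantizer plays the ADC, its inverse plays the DAC, and the round trip is checked against the original to within half a quantization step (a perfect converter pair errs by no more than that):

FULL_SCALE = 1.0
BITS = 8
STEP = 2 * FULL_SCALE / (2**BITS)  # quantization step size

def adc(x):
    # Toy ADC: map an analog value in [-1, 1) to an integer code.
    return int((x + FULL_SCALE) / STEP)

def dac(code):
    # Toy DAC: map the integer code back to the center of its bin.
    return code * STEP - FULL_SCALE + STEP / 2

# Test the ADC with its dual: round-trip error is at most half a step.
for x in (-0.999, -0.5, 0.0, 0.123, 0.987):
    assert abs(dac(adc(x)) - x) <= STEP / 2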

VTF-FTV
A similar argument applies to a voltage-to-frequency converter (VTF) and a frequency-to-voltage converter (FTV). These devices exist on integrated circuit chips. The same chip can be run in one direction to be a VTF and in the other to be an FTV. It is a profound thing to convert a signal in the time domain to a signal in the frequency domain. Each representation has its strengths and weaknesses. Moving a signal into the frequency domain makes some things, like filtering, very easy to understand.

HW-SW
Hardware (HW) and Software (SW) aren't really duals in the sense of the other examples. I have a friend who likes to move functions into software. I would like to see more things move into hardware. So we duel, in a friendly way of course. But you may be getting an idea here and I will leave you to that way of thinking.




Tuesday, December 15, 2009

Reading Past the End: The Knotted Universe


While completing a lecture I stumbled upon Robert Scharein's thesis, which, along with its companion software, enables one to generate knots.



This is knot the end. Think of milk drops. Think of cymbals. Think of resonating loops or 'strings' as in Brian Greene and String Theory. Can a string ring in different modes? Can a knotted string be ringing in different modes? Two such modes, orthogonal?

I can build a circuit or structure that will ring in several modes. But can a circuit be knotted? A loop antenna exists at the electron level and at the macro level. Such loops communicate by radiating and receiving photons. Knotted fields of energy. Perhaps deeper issues in physics, space-time and dimensionality are connected via knots.

Knots appear in organic chemistry. Right-handed glucose gives cells energy. Left-handed glucose is useless, unless you live in a world that is the mirror image of our own.

Knots appear in DNA storage on histone coils, where multiple levels of recursion enable a six-foot strand to fit in each cell.

Proteins that fold correctly function properly. There are protein-folding diseases: BSE (Mad Cow), Alzheimer's, Huntington's chorea, and even cataracts.



The issues of three dimensional correctness are covered with amazing clarity and brevity in this knot thesis. Knots are symbols with structure and meaning, flying all around us.

There exist symbolic algebra, symbolic geometry and symbolic topology.

Understanding how the three relate will make life easier for their implementers and students. Thus a rudimentary understanding of them is essential in the set of thinking primitives we require.

Monday, September 21, 2009

Delay Discounting Measures Operational Amplifier Gain



As a systems engineer and professional mathematician, I sometimes notice that two completely different fields are governed by the same model, equations, or phenomenology. One might never guess that the decomposition of ammonia gas over a platinum screen has the same underlying equations as freefall from an airplane. This observation is unlikely unless one is exposed to both contexts and happens to also know the fundamental mathematics. By coincidence or luck, sometimes one stumbles into seeing the similarity.

Recently I observed this as a test subject in a medical psychology experiment called "delay discounting". 








These studies characterize addictive behavior by attempting to measure a person's tendency towards impulsiveness and control of same. Impulsiveness is quantified by asking the test subject questions like, "Would you rather have $50 now, or $100 in a week?", on a sliding scale where the impulsive person will take anything in the here and now, rather than pie in the sky after some interval of time. Delay discounting behavior is a useful window into addictive and other human behaviors. Consider the show Survivor, where the contestants will pay $500 for a hamburger if they can have it right now, even if it means risking a million dollars in a few days.

In delay discounting there are four categories of questions: gain and loss versus time, like the question above, and gain and loss versus certainty of reward.
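
The standard single-parameter model in the delay-discounting literature is Mazur's hyperbolic discount function,

V = \frac{A}{1 + kD}

where V is the present value of a reward of amount A delayed by D, and k is the subject's discount rate: the impulsive subject has a large k and steeply devalues the delayed $100. The certainty questions are conventionally fit with the same form, with D replaced by the odds against payout.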

The stunning (to me at least) observation is that after a subject answers these questions, they have effectively calibrated the gain curves on four specific operational amplifiers.

Amplifiers take small signals as inputs, and subject to variables like feedback, gain setting, and stability, produce large signals as outputs. There is a saying in electrical engineering that, "Amplifiers Oscillate and Oscillators Amplify". One may extend that saying to the world of digital filters to say, "Filters Amplify and Amplifiers Filter".






Without going into an incredibly boring tirade, let me just provide a few assertions that apply to the amplifier model.

1) Clusters of neurons sum their inputs to produce an overall action, and this is similar to amplifying a small input to produce a large output.
2) Different clusters of neurons, responsible for different activities, have different gain settings. The gain settings of these clusters can be imaged using techniques like fMRI and PET.
3) After measuring an individual's gain curves, one could actually predict (the thesis of the delay-discounting world) their propensity to engage in various addictive behaviors.
4) Once an individual is characterized, one could simulate the behavior of that person with an analog or digital amplifier.
5) One could create an electronic implant to control addictive behaviors in willing individuals who are afflicted. (Implants in the unwilling are beyond the modest scope of this note!)

Addictive behaviors have a wide range of expression and involve substance and non-substance stimuli (gambling, for instance). Basal ganglia disorders, like Obsessive-Compulsive Disorder (OCD), may also involve the brain centers connected to addiction.

Whether addictive disorders are architectural from brain morphology, neurochemical from nerve cell receptor distribution, or both, I do not know. What I do know is that the amplifier model can be very useful for characterizing both the likelihood and the expression of the behavior in individuals for which the measurements have been properly made.

In that sense, delay discounting measurements provide what engineers call, "A Characterization of the Amplifier".
 

This would certainly seem a useful model for characterizing behavior, in my mind at least.




Ref: Eisenberg et al., Behavioral and Brain Functions 2007, 3:2. doi:10.1186/1744-9081-3-2


Sunday, September 20, 2009

Boosting the Shuttle to Geosynchronous Orbit





Figure Created Using Geometry Expressions™



The Space Shuttle could (theoretically) be boosted to Geosynchronous Earth Orbit (GEO) using a Hohmann Transfer Orbit. It would require two burns. The first burn would increase the speed of the shuttle from 7.62 km/sec to 10.0 km/sec. After entering the transfer ellipse, 5.3 hours would elapse until the second burn was necessary to circularize the orbit; otherwise the shuttle would stay in an elliptical orbit. This second burn would produce a change in velocity of 1.45 km/sec, leaving the shuttle in GEO, orbiting at 3 km/sec at a fixed point over the Earth's surface. The beauty of these two burns is that they are both tangent to the orbits and use the minimum amount of fuel.
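
As a cross-check on those numbers, here is a short vis-viva sketch in Python. The shuttle's orbital radius is inferred from the 7.62 km/sec figure above (about 490 km altitude, an assumption on my part):

import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_GEO = 42.164e6      # GEO radius, m
V1 = 7620.0           # initial circular speed, m/s
R1 = MU / V1**2       # circular radius implied by V1
A = (R1 + R_GEO) / 2  # semi-major axis of the transfer ellipse

def vis_viva(r, a):
    # Speed on an orbit of semi-major axis a at radius r.
    return math.sqrt(MU * (2/r - 1/a))

dv1 = vis_viva(R1, A) - V1                        # first burn, ~2.37 km/s
dv2 = math.sqrt(MU / R_GEO) - vis_viva(R_GEO, A)  # second burn, ~1.45 km/s
coast = math.pi * math.sqrt(A**3 / MU) / 3600     # half the ellipse, ~5.3 h

print(round(dv1), "m/s,", round(dv2), "m/s,", round(coast, 1), "hours")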


For verification purposes I have included links to the two programs used for this analysis. It is important when driving spaceships to have consensus before turning on the motor.


Appendix: Calculations in wxMaxima™



Wednesday, September 16, 2009

Tracking the 100 Brightest Satellites

Robert Simpson created this Google Earth database for the 100 brightest satellites using the NORAD Two Line Element database. This database updates with changes in spacecraft position, which is handy. For FallingStar meteor-tracking work, the following extensions would be useful:

1) Add trajectory paths.
2) Merge with NAVSPASUR so we can listen to crossings.
3) Add more satellites and sort these by type for amateur radio operations.
4) Add more information to placard display when spacecraft are selected.

Sunday, September 13, 2009

Hamtrak - Excerpt from Work In Progress


Added the ability to annotate RF sources streamed from data capture.

Lane Lickers Criterium - Sep 19, 2009


(click images to enlarge)


Meet at Cook's Landing, Ride to ADEQ Loop, Have Fun!

Saturday, September 05, 2009

Radio Light


I’ve got hamtrak, my communications monitoring program, running more reliably. It listens on my soft radio and plots pins in Google Earth as amateur radio contacts occur. I wanted to know if there was bias in the reception I was getting due to geographic, antenna or electronic factors. I let it run for 11 hours. Then I compared the picture it produced with US population as seen from space:





For this small sample, the visual correlation appears representative.

Friday, September 04, 2009

A Solution to the North Rising Sun



Lately as I ride across the pedestrian bridge at sunrise, I have noticed the sun has been rising in the north. Having been informed that the sun always rises in the east, I found this perplexing. The trouble turns out to be the accumulation of two interesting factors.


1) The pedestrian bridge does not head due north; it is rotated 15 degrees towards the east.

So believing the bridge to be north-south was problem one.


2) The sun does not rise due east. Tomorrow (9/4/2009) it rises exactly 9 degrees north of east. But back in July, when I was first having the problem, it was rising 28 degrees north of east. As late as August 4, it was 21.4 degrees north of east. Moreover, just before sunup, the sun is another couple of degrees north of east, when its light is beginning to fan out across the sky.


3) Accounting for the early light makes 30 degrees north, plus 15 degrees of bridge rotation, so the sun APPEARS to be rising 45 degrees north of due east, and that surely looked wrong. I noted this, fearing some sort of cosmological malfunction of my brain or a dire state of misinformedness.


4) The sun does not rise due east; it rises in the northeast in the summer and the southeast in the winter. This seems paradoxical until one remembers that the winter sun rides lower in the southern sky as the northern hemisphere tilts further away from it. It rises due east only one day of the year. This year that will be September 23 at 7 am CDT, a day after the equinox. After this, the sun's rising point heads south of east, reaching a maximum of 28.6 degrees south of east around the solstice, December 21. (A one-line spherical-trigonometry check appears at the end of this post.)


5) Riding on the bridge, the sun will appear to rise in the east on Halloween morning at 7:15 am, in a suitable tribute to my distress. The next day we reset our clocks, introducing a new kind of biological confusion.
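
For the curious, the rising direction follows from one line of spherical trigonometry: sin(θ) = sin(δ)/cos(φ), where θ is degrees north of east, δ is the sun's declination, and φ is the observer's latitude. A sketch, assuming a latitude of about 34.75 degrees north for the bridge and approximate declinations for the dates above, reproduces the figures in this post to within a fraction of a degree:

import math

LATITUDE = 34.75  # degrees north; assumed location of the pedestrian bridge

def degrees_north_of_east(declination_deg):
    # Rising azimuth offset from due east, ignoring refraction and horizon dip.
    s = math.sin(math.radians(declination_deg)) / math.cos(math.radians(LATITUDE))
    return math.degrees(math.asin(s))

# Approximate solar declinations for the dates in the post.
for label, dec in [("summer solstice", 23.44), ("Aug 4", 17.5),
                   ("Sep 4", 7.2), ("equinox", 0.0)]:
    print(label, ":", round(degrees_north_of_east(dec), 1), "deg north of east")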