On 01/29/2010 07:24 PM, Shawn Green wrote:
> Rudy Lippan wrote:
>> On 01/29/2010 02:57 PM, Chris W wrote:
>>> Hardcore stupid if you ask me. I suppose it is "possible" to have a
>>> valid reason (can't imagine what it might be) for using more than 61
>> How about complex data requirements? Depending on the resolution of
>> your data set, I could see a "simple" person-type object that contained
>> name, address, SSN, mother, and birth_info starting to approach the
> In a simplified Object-to-Database map, most object types (classes)
> equate to a single table. Each table will contain several columns. Each
> column will represent one particular property of the object. For objects
> that contain lists of sub-values or sub-objects, you use another table
> (usually called a child) related to the first (often called a parent).
You are speaking to a simplified mapping, whereas I was suggesting one
valid reason for needing many tables in a join, viz., a requirement for
a *high-resolution* data set.
Maybe think of a genealogy-type database where someone might have a good
reason to inquire whether 2 people lived within 5 miles of each other
between July 23, 1843 and September 18, 1858. Simple, right? One little
query. But now, let us add that the city was renamed 4 times, lost in a
war to another country, recaptured, annexed by another city which was
eventually dissolved for lack of tax revenue.
Or to get a few more tables in the mix: Could those people have lived on
the same street, same block, within 15 houses? And let us not forget
that the street has been renamed many times, renumbered a few (using
different numbering schemes), some properties were subdivided, and two
adjoining properties were held, at one point, in single and separate
ownership but ended up being purchased by the same person during a time
when the zoning laws forced a merger.
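To make the "same street" point concrete, here is a toy sketch in Python
with sqlite3 (all table and column names are invented for illustration; a
real genealogy schema would be far wider). The join has to go through a
surrogate street id, because no name string survives the renames:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE street (street_id INTEGER PRIMARY KEY);
CREATE TABLE street_name (             -- one row per historical name
    street_id INTEGER REFERENCES street(street_id),
    name       TEXT,
    valid_from TEXT,                   -- ISO dates for simplicity
    valid_to   TEXT
);
CREATE TABLE residence (
    street_id INTEGER REFERENCES street(street_id),
    person    TEXT,
    moved_in  TEXT,
    moved_out TEXT
);
""")

# One physical street, renamed in 1850.
db.execute("INSERT INTO street VALUES (1)")
db.executemany("INSERT INTO street_name VALUES (?,?,?,?)",
               [(1, 'King St',    '1800-01-01', '1850-01-01'),
                (1, 'Liberty St', '1850-01-01', '9999-12-31')])
# Two residents who knew the street under different names.
db.executemany("INSERT INTO residence VALUES (?,?,?,?)",
               [(1, 'Alice', '1843-07-23', '1849-06-01'),
                (1, 'Bob',   '1851-02-10', '1858-09-18')])

# Joining on street_id (never on the name) still finds the pair.
rows = db.execute("""
    SELECT a.person, b.person
      FROM residence a
      JOIN residence b ON a.street_id = b.street_id
     WHERE a.person < b.person
""").fetchall()
print(rows)   # [('Alice', 'Bob')]
```

Every extra layer of history (renumbering, subdivision, mergers) is
another such table hanging off the surrogate key, and another join.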
Sure it is overkill for your shopping cart, but not for my database of
all worldly knowledge :) It is just a matter of how you look at it.
BTW, if you were writing your shopping cart for a genealogy website that
had the above database, you might just create a view, city(city_id,
current_name), and use that id when storing user/credit card info.
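Something like this toy sqlite3 sketch of that view (the history table
and its contents are invented; the '9999-12-31' sentinel marks the
current name):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE city_name_history (
    city_id    INTEGER,
    name       TEXT,
    valid_from TEXT,
    valid_to   TEXT           -- '9999-12-31' marks the current name
);
-- The shopping cart only ever sees the surrogate id and current name,
-- no matter how many renames the history table records.
CREATE VIEW city (city_id, current_name) AS
    SELECT city_id, name
      FROM city_name_history
     WHERE valid_to = '9999-12-31';
""")
db.executemany("INSERT INTO city_name_history VALUES (?,?,?,?)",
               [(1, 'Koenigsberg', '1724-01-01', '1946-07-04'),
                (1, 'Kaliningrad', '1946-07-04', '9999-12-31')])

current = db.execute("SELECT * FROM city").fetchall()
print(current)   # [(1, 'Kaliningrad')]
```

The cart stores `city_id` with the user/credit card record and never
needs to join the history tables at all.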
> OK, after this last statement I will cut you some serious slack.
> However, and I hope you agree, unless someone is using some rather
> obscene normalization, most queries should not require joins of more
> than 10 or 12 tables to resolve.
Or using multiple imported data sets that are each normalized, or using
a code generator, or, or, or. In general, I agree, but only in general.
> My personal thumbrule is that if I have more than about 7-9 tables in a
> single query, I should probably attack the problem in stages. I do this
> because the physical act of logically (internally) representing all of
> those columns across all of those row permutations in memory can become
> a burden to process.
Here you are talking about working around limitations: Either yours or