I am playing with a db design at the moment and an idea has occurred
to me that I would like to ask for advice / ideas on. The db is for
storing product information in a 'dual hierarchy' way. What this means
is that we have a normalised design with two 'paths' to a specific
product (Category, SubCategory, Segment)(Manufacturer, Brand, Range).
Each 'Product' has a unique id (the EAN code) as well as a unique
technical key (autonumber), and has two foreign keys (range_id and
segment_id). There are some standard fields that are applied to all
products, such as Name, Weight, Height, Width, Depth, and so on. There
are also different fields that would be great to be able to apply to
certain categories of product, for example food products would benefit
from having a 'Flavour' field, while books would benefit from an
'Author' and 'ISBN' field.
What I was thinking of doing was to have a user defined list of
appropriate fields and their data types, per category, so that
appropriate information can be stored for each without having to build
separate data tables per category. I have not tried something like
this before and was wondering if anyone has given it a go?
My initial thoughts run along the lines of having a table related to
the category table where the field list would live. What I cannot seem
to figure out is how to use SQL to return a list of products, with the
user defined fields (and their values of course), for a given category
so that each user defined field 'has a column of its own' so to speak.
I think this should be possible but I have had no luck wrapping my
mind around a method to achieve it.
Any advice or ideas would be greatly appreciated.
The Frog
so you have something like
tblRange
rangeId (pk)
...
tblSegment
segmentId (pk)
...
tblProduct
productId (pk)
ean
rangeId (fk)
segmentId (fk)
now you need to introduce 'features'
tblRangeFeature
featureId (pk)
rangeId (fk)
feature
tblSegmentFeature
featureId (pk)
segmentId (fk)
feature
tblProductRangeFeature
prfId (pk)
productId (fk)
rangeId (fk)
featureId (fk)
featureValue
tblProductSegmentFeature
psfId (pk)
productId (fk)
segmentId (fk)
featureId (fk)
featureValue
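as rough Jet DDL the range-side pair could look like this (the types
and sizes are just my guesses, and each CREATE TABLE would be run as
its own query)
CREATE TABLE tblRangeFeature (
    featureId     AUTOINCREMENT,
    rangeId       LONG NOT NULL,
    feature       TEXT(50) NOT NULL,
    CONSTRAINT pk_tblRangeFeature PRIMARY KEY (featureId),
    CONSTRAINT fk_tblRangeFeature_tblRange FOREIGN KEY (rangeId)
        REFERENCES tblRange (rangeId)
);
CREATE TABLE tblProductRangeFeature (
    prfId         AUTOINCREMENT,
    productId     LONG NOT NULL,
    rangeId       LONG NOT NULL,
    featureId     LONG NOT NULL,
    featureValue  TEXT(255),
    CONSTRAINT pk_tblProductRangeFeature PRIMARY KEY (prfId),
    CONSTRAINT fk_tblPRF_tblProduct FOREIGN KEY (productId)
        REFERENCES tblProduct (productId),
    CONSTRAINT fk_tblPRF_tblRangeFeature FOREIGN KEY (featureId)
        REFERENCES tblRangeFeature (featureId)
);
the segment-side pair (tblSegmentFeature / tblProductSegmentFeature)
is the same shape with segmentId in place of rangeId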
tblRange:
-range_id (pk)
-range
tblSegment:
-segment_id (pk)
-segment
tblProduct:
-product_id (pk)
-ean (unique)
-product
-range_id (fk)
-segment_id (fk)
tblFeature:
-feature_id (pk)
-category_id (fk) (this is two levels 'above' range in the design)
-feature
In theory we can list all sorts of things here. My concern is in two
parts:
1) How to effectively control the set of features that are available
to an individual product without cascading a compound primary key down
the line (so to speak).
2) How to provide a query that for any given category we would see a
list of products with their 'hierarchical' fields as well as including
- column by column - all the defined features (ie/ each feature as a
separate field in the query).
I need (if I am going to do this) to provide the output query as
pure SQL. It cannot be dependent on any VBA to work, as the database
must 'play nicely' with some of our other systems. If I have to carry
a compound key then so be it; I can then use a joining table to
simulate a many-to-many relationship.
Does that clarify what I am trying to achieve a little better? I
apologise for any lack of clarity in my earlier post.
Cheers
The Frog
An entry I read recently in Tony's Microsoft Access Blog comes to mind
...
http://msmvps.com/blogs/access/archive/2010/10/07/bad-carma.aspx
WOW! If you don't read Bad CaRMa to the end and don't say wow (or
worse) then please unsubscribe from my blog. <smile>
I followed a link from fellow MVP Mitch Wheat's blog entry When Design
Goes Bad, and The 'One table to Rule Them All!'
--
Clif McIrvin
(clare reads his mail with moe, nomail feeds the bit bucket :-)
1)
tblFeature gives you a list of possible features for a product
ie. metadata
you could add fields like sequencing, mandatory flags, data type,
default value, etc
now you need a table, tblProductFeature, to assign values
to the features for a given product, ie. the feature data
tblProductFeature:
-pf_id (pk)
-product_id (fk)
-feature_id (fk)
-feature_value
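as rough Jet DDL that could look like this (types are my assumptions,
and I am guessing at a tblCategory table name; sort_order,
is_mandatory, data_type and default_value are the optional extras
mentioned above)
CREATE TABLE tblFeature (
    feature_id    AUTOINCREMENT,
    category_id   LONG NOT NULL,
    feature       TEXT(50) NOT NULL,
    data_type     TEXT(20),
    is_mandatory  YESNO,
    default_value TEXT(255),
    sort_order    LONG,
    CONSTRAINT pk_tblFeature PRIMARY KEY (feature_id),
    CONSTRAINT fk_tblFeature_tblCategory FOREIGN KEY (category_id)
        REFERENCES tblCategory (category_id)
);
CREATE TABLE tblProductFeature (
    pf_id         AUTOINCREMENT,
    product_id    LONG NOT NULL,
    feature_id    LONG NOT NULL,
    feature_value TEXT(255),
    CONSTRAINT pk_tblProductFeature PRIMARY KEY (pf_id),
    CONSTRAINT fk_tblProductFeature_tblProduct FOREIGN KEY (product_id)
        REFERENCES tblProduct (product_id),
    CONSTRAINT fk_tblProductFeature_tblFeature FOREIGN KEY (feature_id)
        REFERENCES tblFeature (feature_id)
);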
2)
this is a hard one without VBA,
if it were a small set of known features, you could write a query like
this (Access wants the joins nested in parentheses)
SELECT a.product_id
     , a.product
     , b.feature_value AS width
     , c.feature_value AS height
FROM (tblProduct AS a
      LEFT JOIN tblProductFeature AS b
             ON (a.product_id = b.product_id AND b.feature_id = 1))
      LEFT JOIN tblProductFeature AS c
             ON (a.product_id = c.product_id AND c.feature_id = 2);
as you can see, you need a 'tblProductFeature' reference for every
feature
maybe you can have one query per category (assuming each category has
a
small set of features)
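another thought: if the column list does not have to be fixed per
query, an Access crosstab will pivot whatever features exist without
one join per feature (the column names then come straight from the
feature names). something like this, assuming tblFeature holds the
feature name in its 'feature' field
TRANSFORM First(pf.feature_value)
SELECT p.product_id, p.product
FROM (tblProduct AS p
      LEFT JOIN tblProductFeature AS pf
             ON p.product_id = pf.product_id)
      LEFT JOIN tblFeature AS f
             ON pf.feature_id = f.feature_id
GROUP BY p.product_id, p.product
PIVOT f.feature;
the catch is that crosstabs are read-only and the column set follows
the data, which may or may not suit the 'play nicely with other
systems' requirement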
Thank you for the SQL approach, it is exactly what I was looking for.
What I will do is add code to the form that allows users to 'add a
feature' to a category, and use VBA to generate the required SQL and
save it as a query specific to each category. Depending on the number
of records and features it might pay me to dump the results to a temp
table. Either as a query or a temp table the 'play nice with other
applications' part is taken care of.
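For the temp table variant the generated SQL would just need an INTO
clause, something along these lines (the feature ids and column names
here are only placeholders, and tmpCategoryProducts is whatever the
temp table ends up being called):
SELECT a.product_id,
       a.product,
       b.feature_value AS Flavour,
       c.feature_value AS PackSize
INTO   tmpCategoryProducts
FROM  (tblProduct AS a
       LEFT JOIN tblProductFeature AS b
              ON (a.product_id = b.product_id AND b.feature_id = 1))
       LEFT JOIN tblProductFeature AS c
              ON (a.product_id = c.product_id AND c.feature_id = 2);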
Clif, that's an extremely funny article on developing the 'Vision'
application. I can't believe that anyone would be that stupid!
Apparently ego and ambition outweighed ability and common sense.
Sounds similar to my office........ Wait! Did I say that out loud?????
Thanks guys.
The Frog
Hi The Frog,
If a little bit of VBA is acceptable, you could make a csv-output,
readable with Excel, and (probably) 'other applications'.
Imb
Thanks for the thought. I am guessing that the IT dept here wants to
be able to get at the data without having to write any complex SQL
themselves. If I can get the userform to craft the needed SQL and save
it as a query in the back end file, then I am hoping that the IT dept
can then access it any way they wish. Basically they don't want to play
with a front end app at all and just want access to the data that is
going to be the product of a lot of user input.
The bigger picture is that the whole exercise is a data cleaning
process to produce a table of products that will have 'trusted'
definitions for their fields as well as the content of those fields.
Each row of data, and each field in those rows needs to be manually
approved. This is a big job indeed for the product lists we have, so I
am trying to ease the pain with a little application that can take the
source 'rough' data in, park it in temp tables, allow the users to
process the data, and place the result into a structured back end that
still leaves them some flexibility to alter things as the business
changes and grows. It will eventually move to a master data system and
the process will be discarded, but not until the existing lists are
processed and cleaned to a trustworthy state. It is proving to be a
surprisingly tedious and frustrating problem to solve, but by no means
impossible.
I really appreciate the feedback.
Cheers
The Frog
Hi The Frog,
With the explanation in your answer I can understand that an output to
a csv-file is too simple. As far as I understand now, you want to
update (changed) data in the database, so in fact your form is
"connected" to the data-tables.
If this is the case, then your form must handle both real data and meta
data (fieldnames, datatypes, etc).
I have a feeling that this is possible. In fact for all continuous
forms that I have in my applications, I use the same form. This form
contains no knowledge of real data or meta data; that is only filled in
when the form opens, and is very dependent on the specific environment
of that form at that moment.
So, if I have a better idea of how your application (or is it still an
idea?) runs, perhaps I can give you some hints on how I solved such a
problem.
Imb.
That is a most generous offer. At the moment the application is still
in the 'paper' stage, but there is a need to build this. In short it
forms part of a larger process, primarily that of generating a type of
trusted data dimension for product master data. In turn this master
data is used as the base of several data warehouse applications and
systems. The current issue is one of data cleanliness. The input is
basically a 'minimum effort' system that is almost wholly dependent on
an external data supplier who is not beholden to us in any way for
data quality (just for the record I was not the one who negotiated
this suppliers contract with us).
So, the idea is to take an ETL process (Extract Transform and Load)
and validate and restructure the data into a form that matches the
business needs. This will be done by Pentaho Data Integration and it
does its job well. What it can't do is tell you if something is 'real'
or not. For example, does Brand X really exist or is it just a
misspelled Brand Y? You can get close with heuristic lookups and
comparisons, and I will be using Levenshtein distances etc to find
nearest fits when a field's value is unknown - but this is only a
guess. In the end a user must make the decision and 'approve' the
field's value. This can end up being a lot of work, so I need to do
some shortcutting to make life easier or the system just won't be used.
This is where the user controls start to come into play.
For each category of product(s) there are necessary 'features' that
are pretty much standard, such as EAN codes or part numbers, a name,
manufacturer etc... So with this in mind I have built a series of
normalised tables to represent the standard components of most any
product:
Category -> SubCategory -> Segment  \
                                     > Product -> Variety
Manufacturer -> Brand -> Range      /
Attached to each of these tables (except for Variety) is what I call a
DS table. A DS table is short for DictionarySource. It is a table that
uses the PK of say Category as a FK, has a field for identifying the
data source (like a point of origin), a field for the 'incoming' term
from that data source. In this way known terms can be matched quickly
and the ETL processing can handle more and more of the data processing
'automatically' so to speak. The result of the ETL will be a list of
'incoming' rows with the source field terms and the known (or guessed)
'output' terms. This is where the user defined part is bogging me
down. Handling the normalised structured part is relatively
straightforward and logical. Allowing the users to define fields per
category that are applied at the Variety level, and matching those
fields to incoming
fields that are available is proving tricky. I figured that a meta
data approach would be about the only way to do this, so I threw the
idea up here on the newsgroup to kick it around and see what came of
it.
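For reference, the Category DS table is roughly along these lines (the
names and types here are only indicative):
CREATE TABLE tblCategoryDS (
    category_ds_id AUTOINCREMENT,
    category_id    LONG NOT NULL,
    data_source    TEXT(50) NOT NULL,
    incoming_term  TEXT(255) NOT NULL,
    CONSTRAINT pk_tblCategoryDS PRIMARY KEY (category_ds_id),
    CONSTRAINT fk_tblCategoryDS_tblCategory FOREIGN KEY (category_id)
        REFERENCES tblCategory (category_id)
);
The DS tables for SubCategory, Segment, Manufacturer, Brand and Range
follow the same pattern against their own parent tables.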
Now that you have the full picture is there anything you might suggest
to ease the pain?
Cheers
The Frog
Hi The Frog,
I have printed your answer, so I can reread it a couple of times, in
order to understand what you mean.
But every time I fall over the _’incoming’ term_ and _’output’
terms_. According to my dictionary the word term has many meanings,
and I do not know which one to choose.
But your problem reminds me of an application that I built for a
distribution company.
They wanted to have a database with the (selected) products of the
suppliers. The source for this information was mostly Excel-sheets or
Word-documents.
As in your case, the products had a couple of pretty standard
attributes, but also many different features, that could be very
different for different products. This could be the color, length,
unit size, etc., with hardly any structure in featurename, especially
not between the different suppliers.
To handle all these different features I had a table with Product_id
(FK), Featurename, Featurevalue. To bring some structure in all
different Featurenames I added a field CompanyFeaturename_id, FK to a
table with the defined CompanyFeaturenames.
When loading a new catalogue from a supplier, Featurenames and
Featurevalues were filled in, but not yet the (standardized)
CompanyFeaturename_id. This empty CompanyFeaturename_id indicates that
some human interaction has to be done.
I can imagine that a CompanyFeaturename_id can already be produced by
a reference table with the relations between Featurename and
CompanyFeaturename from the past, or a best guess with whatever means.
In the latter case you probably need an additional boolean field
“checked”.
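As Jet DDL that pair would look roughly like this (the table and field
names here are only indicative of what I used, and the types are
assumed):
CREATE TABLE tblCompanyFeaturename (
    CompanyFeaturename_id AUTOINCREMENT,
    CompanyFeaturename    TEXT(100) NOT NULL,
    CONSTRAINT pk_CompanyFeaturename PRIMARY KEY (CompanyFeaturename_id)
);
CREATE TABLE tblSupplierProductFeature (
    SupplierProductFeature_id AUTOINCREMENT,
    Product_id                LONG NOT NULL,
    Featurename               TEXT(100),
    Featurevalue              TEXT(255),
    CompanyFeaturename_id     LONG,
    Checked                   YESNO,
    CONSTRAINT pk_SupplierProductFeature
        PRIMARY KEY (SupplierProductFeature_id),
    CONSTRAINT fk_SPF_Product FOREIGN KEY (Product_id)
        REFERENCES tblProduct (Product_id),
    CONSTRAINT fk_SPF_CompanyFeaturename FOREIGN KEY (CompanyFeaturename_id)
        REFERENCES tblCompanyFeaturename (CompanyFeaturename_id)
);
A Null CompanyFeaturename_id means a human still has to map the
supplier term; Checked can flag whether a best-guess mapping has been
reviewed.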
Is this comparable with your problem, or am I on a wrong track?
Imb.
You are on the right track. The 'incoming' data is referring to the
source information that comes from a third party supplier and is the
data that is going to be processed. The 'output' is the result of the
process or to put it another way the definition of a product itself
and its features. The idea you have suggested is along the lines I
would need to implement, and I will experiment with it later today.
Thank you for the guidance.
In doing some further research into the product definitions I have
learned that a lot of fields can be added directly to either the
product table or the variety table, with a query then used to return
the fields needed for a category (ignoring fields we don't need).
I am wondering if I might approach the problem differently, by
defining many many fields for the product and variety tables, and
simply having another table that allows the users to select which
fields belong to each category. It's not perfect but it should be much
faster and easier to implement. Some human decision making is still
involved but it is minimal. It also makes target fields for ETL
processes much better defined and therefore less prone to error,
especially when it comes to handling data types.
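The selection table I have in mind would be minimal, something like
this (names illustrative only):
CREATE TABLE tblCategoryField (
    category_field_id AUTOINCREMENT,
    category_id       LONG NOT NULL,
    field_name        TEXT(64) NOT NULL,
    CONSTRAINT pk_tblCategoryField PRIMARY KEY (category_field_id),
    CONSTRAINT fk_tblCategoryField_tblCategory FOREIGN KEY (category_id)
        REFERENCES tblCategory (category_id)
);
The saved query for a given category then just selects the flagged
columns, e.g. something like SELECT p.ean, p.product, v.Flavour FROM
tblProduct AS p INNER JOIN tblVariety AS v ON v.product_id =
p.product_id for a food category (assuming Variety hangs off Product
on product_id).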
I will do some experimenting later today and let you know the
results.
Cheers
The Frog
I'd sure be tempted to have lots of empty fields in the Item table.
With a field on the category table stating what type of fields should
be displayed. Then create subforms and subreports on the type of
fields. Then make visible the appropriate subforms and subreports
depending on the category type.
Yes, this is the standard boring approach. But when you consider
you've got to take into account field widths and order on the subforms
and subreports, along with lots and lots of extra code and logic,
boring is good.
If the above is making sense.
How many different sets of subforms and subreports do you anticipate?
Tony
--
Tony Toews, Microsoft Access MVP
Tony's Main MS Access pages - http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
For a convenient utility to keep your users FEs and other files
updated see http://www.autofeupdater.com/
Actually subforms and subreports might not be the right solution.
Groups of controls with tags which are made visible/not visible and moved might be
the best answer.
I am expecting between 7 and 10 categories in total. I was thinking of
a field list of available 'features' with about 25 different fields.
Most of the products being dealt with are food based so the required
'features' are almost the same with only a few exceptions.
If I was to build the generic model based on a meta-data driven
approach I am not sure that the return for the effort is worth it in
this case. On the other hand having a meta-data driven set of forms
for the different user interfaces might be a great idea. I could add
in a few 'generic' fields that could be re-purposed by users to hold
simple data like strings or numbers, and a table to keep the necessary
descriptive info similar to imb's suggestions above, but just for the
'name' the user wants to see the column as, as well as the proposed
data type. To get the data back out I would have to have the user form
that allows the customisations also generate the required SQL for a
query to act like a view that returns all data for a category (or
maybe just save the SQL in a table in the BE - depends on what the
external applications want).
After doing some basic data cleaning to get a better grasp of the
situation I have managed to determine that from the several million
rows of incoming data there are only about 40k rows of 'actual'
original data. There are many repeats in the incoming lists, and this
will have to be handled by the ETL application -> the end result is
that the app I need to build is going to do the final 'read in' of new
product data, allow users to 'approve / discard' or alter the incoming
data, and provide a comprehensive 'view' of that data available in the
BE. Access will be the BE for this as the data needs to remain
portable and easily interfaced with other Office applications. If all
of this works to plan then the various 'macros' and functions that we
(and by we I mean me) build should reduce many hundreds of man hours
worth of work down to only a few minutes. Makes me glad I developed an
Access report to PowerPoint slide function - the test report for this
reduced a week and a half / two weeks worth of manual work down to
about 1 minute. The next is far larger in scope and wholly dependent
on this app (and the ETL that feeds it) to do its job properly and
efficiently. I think this is one of the best office projects I have
worked on in years.
Cheers
The Frog
Hi The Frog,
With interest I read Tony’s remark, and was triggered by “boring”. For
me boring has a negative meaning, and so I did not understand what
“boring is good” means. But it did not matter anymore after reading
your comment. Apart from knowing how you will do things, I see now
_what_ you want.
Let me tell you my story. Long time ago when I started with my first
Access-form, I had to define all controls one by one. And I had to do
the work over with the second form, and the third. Really really
“boring”. And it did not matter whether these were forms or controls
or reports or procedures, “the second time” is one time too much.
So I started to think how I could parametrize all these small
processes. And in fact this was the start to structure my applications
through the adding of meta-data.
After many, many, many iterations all my applications now run on meta-
data, and I have reversed the process of making applications compared
to standard Access. Just adding the meta-data is sufficient to create
anything in the application.
So adding a new name to the “database-of-databases” creates a new
application, including FE and BE, totally functional, with meta-data
tables, but not yet the user tables.
Adding a tablename to the local "table-of-tables" creates the table
in the BE, including links, PK, forms, relations, etc.
The same holds for the fields in the “table-of-fields”, whereby the
behaviour of the field can be described in detail, including preferred
datatype (with more possibilities than the standard Access-datatype),
visibility, edit levels, etc.
It is like the Bible: let there be a database …, let there be a
table, …, let there be a field.
For all continuous forms I use one and the same “Overview_form”, for
all single records one and the same “Single_form”, and for new records
also one and the same "New_form". The standardized forms contain all
the functionality that is needed in these forms and are populated
through the meta-data, including the creation of the Recordsource,
dependent on the context in which this form is opened, so there is no
need for any task-bar or ribbon.
User-flexibility is just changing some meta-data. A very important
advantage is that users do not need to change anything in the FE, no
changes in the definition of forms, so no bloating at all, and - until
now – no corruption.
Even changes in the data-definition of the BE are controlled by meta-
data, so that the changes in the “development” part are automatically
transferred to the “production” part, when installing a new version.
The other side of the coin is that it is not standard Access anymore, and
much of the standard functionality of Access is not used anymore. No
QueryDefs, no Macros, no Reports, but that is a different story. It
seems to me that many contributors to this discussion group don’t like
that approach.
But I love it, and my users too.
Imb.
Access' capabilities certainly generate all kinds of possibilities for
abstraction. Using "meta-data" as you have is one of those
possibilities. The PowerPoint conversion through software, though
involving less abstraction, is another. You don't need to be too
concerned about not having QueryDefs, Macros or Reports. I use fewer
QueryDefs these days, only the AutoExec macro, and am more likely to
output a PDF file than an Access Report. You also shouldn't be too
concerned if few developers follow your example. You've put it out
there and we can decide how much, if any, of your technique to adopt.
I'm sure that Microsoft realized that people would use Access in ways
they didn't totally anticipate. The Frog's initial idea of a dual
hierarchy is not that uncommon. In manufacturing, it is often
desirable to track Sales Orders (jobs) by both network activity
(similar to Microsoft Project tasks) and by department/subdepartment.
The main point is to have a reason for how things are set up, along
with a plan to ensure that the initial setup will be robust.
The conversion to SAP where I work resulted in nearly every process
becoming slower than those using the Access database, but upper
management likes SAP's features so much that they are willing to put
up with angry customers whose projects are delayed, angry
manufacturing processors who can't order things as quickly, and even
the need, which rankles management, to hire additional employees to
handle the additional work it caused. That was all after spending
hundreds
of thousands of dollars on robustness consultants, some making in the
neighborhood of $700/h (after spending $1M for the SAP licenses). I'm
waiting for the opportunity to customize things to speed up the
processes, but politics are involved and no connection strings are
allowed at this point into the SQL Server backend being used for SAP.
There are some native functions in SAP that make sure that certain
changes are consistent, but even read-only views of data through
connection strings are not allowed. SAP data for legacy Access
systems have to come from views within SAP that must be manually
exported to a spreadsheet from SAP before being imported into Access
via form code. Having an evolved way in Access for form creation and
report generation is way down on my priority list at present.
James A. Fortune
CDMAP...@FortuneJames.com
My vision for workflow is to keep documentation in synch with the
software for ISO certification purposes. Management's vision for
workflow is to hold employees' feet to the fire.
Seems to me this app could just as easily have been built in VB6 then.
Now what happens if you have some code to check various field values
for correctness?
Hi Tony,
I think you're right about VB6, and a lower-priority reason to
take this direction was to – finally – become Access-independent.
But, Access is very handy for the few forms that I use, for storing
the data in tables, and most importantly for manipulating all the
different applications from a “Database-of- databases”. Moreover, my
target users are mostly small scale (social) organisations that have
no budget for commercial software, but they have Access installed.
About your question on checking various field values for correctness I
would like to have some more information, because it is hard for me to
really understand what you mean.
Imb.
I have lots of code in various places behind forms that check for
valid data or warn if there's a problem of some sort. Or that
enable/disable/visible/not visible controls.
Tony
--
Tony Toews, Microsoft Access MVP
Tony's Main MS Access pages - http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
So you have a database of 'databases', and you keep in this the
designs for the various form layouts, table designs etc... for the
different applications you create. This sounds like a really excellent
idea. To make changes you would just need to alter the design once in
your 'master', and the clients' FE/BE are then going to use this data
for building the forms they need.
Extending this logic further, you could have the required meta-data
for any 'object' inside Access, such as reports and queries, and do
the same as for forms. Extending this even further, if the entire
application is abstracted in this way you could post a website with a
database and tables to contain the meta-data and have a single FE that
goes to all clients. The FE would check the website (or perhaps an
update file) for the required meta-data, perhaps based on a license
code, and that's all!
Did you have to extend your VBA IDE to achieve this meta-data
approach? Did you build a 'special' Access application to give you the
functionality you need? Does your FE also work in the Runtime mode?
Have you found any problems that would stop you from using this
approach for any reason? Can you share an example of the meta-data you
use for a form?
This is a most fascinating set of possibilities for Access that I had
only half believed might be possible. To make Access into a RAD
environment in this way is a very powerful possibility. Combine this
with the ability to work with 'Big Iron' databases as the BE (eg/
Oracle, SQL Server, MySQL, PostGres, etc...) and you have a very
powerful platform indeed.
Cheers
The Frog
Hi Tony,
Thank you for your explanation.
Yes, I also have checks for valid data. That is done on different
levels.
First it is checked whether the data is of the right datatype, so that
it can be placed in the table. There is also already a length check.
Second there is a check whether controls must have a value.
Third a control can have properties or features, which dictate some
conditions for this control. For instance a date must be in the future,
or in the past, or after another date, or it must be a Saturday, etc.
Or a zipcode must be available in a zipcode-catalogue. And many more of
such conditions.
Fourth - and I am working on that now - a signal whether data will
conflict with the uniqueness of the data, not only for new records,
but also for changed existing records.
And fifth, all other cases are covered by local code in the FE, probably
as you have.
When any of these conditions are not met, you can not leave the
control, except by giving valid data, or giving a Null-value. In the
latter case, if the control is obliged (good English?), you can not
store the data.
Depending on whatever you want you can enable/disable/visible/not
visible controls.
But remember, Tony, this is not hocus-pocus. It is only generalizing
your routines so that they can run with the information that is stored
in the meta-data table that describes the behaviour of that control.
Once generalized, it can be used for any control in any application,
without reprogramming, and, because it has already been tested many
times, robust.
I laughed at that article. It was a great read. My goals are not so
lofty and ambitious, but rather more focussed at the moment on a few
fields and maybe a form or two to play with those fields. 'Vision' is
the sort of project you give to academics to debate about while we
just sit back and enjoy the show.....
My interest in the forms, reports, etc is to ease the maintenance
aspect from a distribution point of view. Tony's AutoFE Updater is a
great tool, but for some scenarios with disconnected users (read as
Field Sales Force) it can be difficult to ensure compliance. Shipping
meta-data is a simpler problem to solve, but in itself creates other
problems. So I am just kicking around ideas and enjoying the thread
and other people's points of view and experiences.
Cheers
The Frog
Hi The Frog,
You must still be young and dynamic, to go so fast with your
thoughts, and already extrapolating to a “world”-system.
I am a little pragmatic. I know where I am standing now, and I have a
problem to solve.
If I already have an answer for the problem, then I am ready.
Otherwise I may conclude that my systematics are not yet good
enough and need to be adjusted. And in some cases I am simply not yet
that far.
Now your questions. I will answer per paragraph.
1. The “database-of-databases” is a utility that can inspect all meta-
data tables and all modules in all the FE of the different
applications. If I have to change something in meta-data tables or in
the systematics around these tables, then I can determine in what
application and where it can have consequences. It is only for
development purposes, and for starting a new, clean application.
All applications are linked to a library-database that contains
the knowledge on meta-data and how to process it.
2. Every client has his own very specific FE! The FE of each
application contains filled meta-data tables, the links to the BE,
and copies of the standardized forms, and modules for very specific
code. But the structure of the meta-data tables is exactly the same in
all applications.
At the moment I do not use QueryDefs or Reports, and extrapolating
is one step too far.
3. Until a half year ago I used VBA from Access98, because there was
still one client using that.
So nothing special, for me it is more “a way of working”.
4. Yes, the possibilities with this approach are fascinating! As in
Access2007 nowadays, each date control already had (in Access98) a
calendar attached, with more flexibility than in Access2007.
Another very powerful item is that each control can have its own
dynamic menu with only the actual valid possibilities, depending on
authorization and context. As I answered to Tony, it is not magic, but
just code, but very generalized.
And about RAD and the rest, I still have some other work to do before
I get that far.
Imb.
<snip>
> When any of these conditions are not met, you can not leave the
> control, except by giving valid data, or giving a Null-value. In the
> latter case, if the control is obliged (good English?), you can not
> store the data.
imb -- I suspected early in this thread English was not your first
language ... I'm much impressed with how well you are able to use my
language. To answer your question here, I would have used "required"
instead of "obliged" ... in any case, what you wrote made your point.
As I understand your approach, the application development
environment you have created required much discipline of thought and
design to achieve the robust generalization that you speak of <smile>.
As The Frog said, I'm enjoying the conversation in this thread.
A future enhancement will handle disconnected operations quite nicely.
And update the FE as soon as they are next connected to the network.
But it'll be a while before I get there.
Tony
--
Tony Toews, Microsoft Access MVP
Tony's Main MS Access pages - http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
I had another thought with regards to the updater after mulling over
the content of this thread. I have seen and used some java based
applications that are (I believe anyway) based on either Eclipse or
NetBeans (using them as platforms for the application kind of like a
shell to put your content in), and it dawned on me (facepalm moment)
that this is almost the same concept as Access. I then thought about
the meta-data side of things, and came to an interesting possibility -
what if there was a way to use the AutoFE Updater to not swap out FE's
per se, but rather to extract the meta-data from a 'master' and
implement it into the clients' FE? This would allow posting the meta-
data to a website, as text (for example) and distribution becomes a
non-issue (at least in most cases). Maybe I am reaching too far, but I
thought that it was worth passing on.
Cheers
The Frog
Thank you for sharing. I appreciate the effort you have made in coding
your application, communicating in English, and taking the time to
contribute to this thread. Thank you.
Out of curiosity, what is your mother / native language? As Clif has
said, you are expressing yourself extremely well, and we appreciate
the effort.
I have one last question for you: How long did it take you to achieve
a stable codebase?
Cheers
The Frog
(and yes, I like to think of myself as young and dynamic - my wife may
have a different opinion!)
Hi The Frog,
If your wife’s different opinion is old and dynamic, then she took the
best of all.
My native language is Dutch, so at the moment we are almost
neighbours. Sometimes it seems that my second language is VBA.
The time it took to reach this point is hard to say. It is like an
evolution in small steps, a learning curve, with a steady but
exponential growth. So the time it will take to make such an
application is very much dependent on the level at which you step in,
or in fact what you have learned already.
The heart of the system is the reference database, with about 20,000
lines of code and the basic forms. In total an Access database of 20
Mb.
But the number of lines is not the crucial point (a former boss wanted
to measure "performance" in terms of produced lines of code); it is
the effectiveness of this code, without redundancy and maximized for
re-use.
Looking back, the whole process was not more than an investment in
structure on how you will manage your applications (i.e. meta-data),
and after each step of improvement you “feel” already its result. And
many small steps make a beautiful system.
Steadily going further, step by step.
Imb.
Seems to me that is basically replication. Metadata being, by
definition, in tables.
Tony (One of my standard short replies. <smile>)
Hi Tony,
You are right. The meta data is in the tables. The rest is meta
information: the way you organize the process flow.
Alas, I'm not so puristic in my English.
Imb.
Hi Tony,
After re-reading your post I concluded that you did not catch the
point.
Sure, meta data is stored in tables, but putting one or more tables
with meta data in an application does not do anything. You need
software, and sometimes lots of software, in order to let your
database run, defined by the data in the meta data tables.
Imb.
> On Thu, 9 Dec 2010 01:27:58 -0800 (PST), The Frog
><mr.frog...@googlemail.com> wrote:
>
>>What I was thinking of doing was to have a user defined list of
>>appropriate fields and their data types, per category, so that
>>appropriate information can be stored for each without having to
>>build separate data tables per category.
>
> I'd sure be tempted to have lots of empty fields in the Item
> table. With a field on the category table stating what type of
> fields should be displayed. Then create subforms and subreports
> on the type of fields. Then make visible the appropriate
> subforms and subreports depending on the category type.
I would argue against it. I have a very old app that has different
record types, where different fields mean different things, and it's
been a disaster to maintain for a whole host of reasons. While I
think you're suggesting dedicated fields for each record type (as
opposed to re-using a single field in multiple record types that
would then mean something different for each record type) and that
avoids some of the problems I've encountered, it still presents a
lot of issues for user interface in Access.
This is a version of the supertype/subtype problem and after long
experience experimenting with different ways to implement it,
I've decided it's better to just make different tables when there are
lots of non-overlapping attributes, and forgo the benefit of having
everything in a single table.
--
David W. Fenton http://www.dfenton.com/
contact via website only http://www.dfenton.com/DFA/
> Looking back, the whole process was not more than an investment in
> structure on how you will manage your applications (i.e. meta-data),
> and after each step of improvement you “feel” already its result. And
> many small steps make a beautiful system.
ibm :-),
I once tried using a table as a data map. Each field on the form was
of the type <ctl><FieldName> (e.g. cbxFirstName) for a field called
FirstName. The input could come from any table/field. The schema was
something like:
ID AutoNumber
ControlName Text
NewData Y/N
NewTable Text
NewDataType Text
FillFormExistingData Y/N
SubmitData Y/N
SourceDestinationTable Text
SourceDestinationField Text
SourceDestinationDataType Text
Each ControlName in the table would be read when opening, editing or
submitting data (note: this was for an unbound form). The table
controlled the default data to be used for new records, where the data
came from for old records, and where edited data would be saved. The
form interacted with about six tables at the same time, so the
flexibility of having a data map table was helpful. I have tried many
kinds of generalizations with Access, with varying degrees of success.
James A. Fortune
CDMAP...@FortuneJames.com
Hi James,
Nice to call me ibm, but alas, I am not a shareholder.
As far as I can recall, you are almost the first who reports on his
generalization efforts, though my feelings are that everyone who is
more or less experienced in Access heads for that direction. Is there
a taboo or is it a matter of “intellectual property”? Or simply too
difficult?
I tried to understand the model that you build, but it is hard to
understand it in depth. There is a couple of similarities, there is a
couple of differences.
The most important difference is that I use bound forms, that means
that I do not need to distinguish between ControlName and
SourceDestinationField.
I use a DataType, but do not differentiate to NewDataType and
SourceDestinationDataType. This DataType is a little more subtle than
the Access datatype.
This table - as the CollectionOfControls - contains also all other
information that I need to define this control in its specific
context, such as dimensions, edit and delete authorization, default
sorting, duplicate record checks, actions after modification, and
more.
I have four important forms: one for adding new records, one for
inspecting/editing single records, one for Overviews (continuous form)
and one form as a “Selection form”. This latter form is a special case
of an Overview form, and replaces all Comboboxes anywhere in the
application.
Your most interesting remark was, that you tried many kinds of
generalization, with varying degrees of success. I like to hear those
attempts that did not result in the wished success, because from these
attempts you can learn the most.
Imb.
> As far as I can recall, you are almost the first who reports on his
> generalization efforts, though my feelings are that everyone who is
> more or less experienced in Access heads for that direction. Is there
> a taboo or is it a matter of “intellectual property”? Or simply too
> difficult?
Access certainly creates an environment that facilitates
experimentation. There has to be a critical mass before
generalization is justified. For some, day to day pressures don't
allow them to set up a properly abstracted database for a given task,
even if they know that's what they should do. For some things, it
takes a substantial familiarity with Access even to know what's
possible. Code that automatically writes code has been largely
abandoned based on painful lessons shared in the Access newsgroups.
But when I determine that a more abstract way of programming in Access
will provide benefit and implement it, I am happy to share the fruits
of that labor with others, along with warning signs for the gotchas.
>
> I tried to understand the model that you build, but it is hard to
> understand it in depth. There is a couple of similarities, there is a
> couple of differences.
> The most important difference is that I use bound forms, that means
> that I do not need to distinguish between ControlName and
> SourceDestinationField.
> I use a DataType, but do not differentiate to NewDataType and
> SourceDestinationDataType. This DataType is a little more subtle than
> the Access datatype.
> This table - as the CollectionOfControls - contains also all other
> information that I need to define this control in its specific
> context, such as dimensions, edit and delete authorization, default
> sorting, duplicate record checks, actions after modification, and
> more.
>
> I have four important forms: one for adding new records, one for
> inspecting/editing single records, one for Overviews (continuous form)
> and one form as a “Selection form”. This latter form is a special case
> of an Overview form, and replaces all Comboboxes anywhere in the
> application.
The basic idea of using metadata is sound. The idea about many
elegant solutions implementing another level of abstraction is fine as
long as the cost of implementing that level of abstraction doesn't get
too high.
> Your most interesting remark was, that you tried many kinds of
> generalization, with varying degrees of success. I like to hear those
> attempts that did not result in the wished success, because from these
> attempts you can learn the most.
A few come to mind immediately:
Flexible Reports:
http://groups.google.com/group/comp.databases.ms-access/browse_frm/thread/e5f9ed5253e232c1
The object of the flexible report was to be able to select up to ten
fields from a table to include on a report and have the report code
automatically size the data to make use of the available space on the
form. I had some properly normalized tables with over 70 fields each
(the data was related to aerospace manufacturing). If the maximum
data length got too large, it would shift to a different report to
show the data, first to landscape, and then to legal, if necessary.
The users really liked having that much flexibility, but the setup was
lengthy and it was not particularly easy to add another new field
possibility to the reports. After performing a search, a new form
with all the table fields with checkboxes by each one allowed the user
to add or delete fields shown in a listbox. The previous list came up
as the default and users could also save or retrieve a given list of
table fields. The report would include only those fields with only
the records that conformed to the search results.
Creating form from code:
http://groups.google.com/group/microsoft.public.access/msg/0fd179f25848911f
I was trying to create search forms automatically from Access tables.
This had a lot of promise because users often want such a useful form
created and they really love using them. The code had two main
problems. The first was that it was "blue-collar" code that wasn't as
elegant as what some others started to post about their initial
efforts into doing something similar. The second was that given
enough fields, simply putting each new control on the form vertically
reached the 22" limit on the form height. It's not that hard to add
logic to handle that, but I never got around to dealing with it.
Also, there was the problem of putting in, via VBA, all the correct
VBA code to dynamically generate the SQL required for the search
behind the command button. That seemed straightforward, but time
consuming. The bottom line was that the search forms I had already
created kept getting more features and there wasn't enough demand to
create totally new search forms to justify finishing that effort.
Is it advantageous to use .NET Framework DLL's in Access?
http://groups.google.com/group/comp.databases.ms-access/msg/56476f5e582e43fb
There's a host of .NET enabled additions to Access that I would've
liked to investigate. With Access being cut back where I work I have
little justification to make those investigations. The new owners
still give me fairly wide latitude and rely on my discretion when it
comes to investigating new technology, but I try to keep my forays
focused. Maybe it would help to give out the CEO's email address and
ask people who have benefited from my posts to send a thank-you
note :-). I am spending a significant amount of time getting up to
speed with C# and WPF, since those are the most likely avenues, if
any, for modifications to SAP. I have read through many books about
either C#, WPF, Concurrent Programming, .NET or LINQ during the past
year. I'm not sure how much information will make it back into
creating enhancements to Access. I like what I've been able to do
with Access so far, but I also need to develop at least similar
capabilities with other tools. I have a clearer picture right now
about what technologies I will need to use later. That is helping me
focus my efforts.
James A. Fortune
CDMAP...@FortuneJames.com
The art of simplicity is a puzzle of complexity. -- Douglas Horton
Yup. I was thinking later that my reply was quite wrong in that, as
you point out, I hadn't thought things through completely.
Tony
--
Tony Toews, Microsoft Access MVP
Tony's Main MS Access pages - http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
oh.. great.. it's obvious that you don't understand anything about normalization!