Wednesday, December 14, 2016

If You Still Need it, Here's Even More Proof of Life for Our Lovable Little Friend -- Access


Microsoft has announced that Access is being added to Office 365 installations. That means businesses moving to Office 365 can continue to use the most popular relational database design tool in the world without jumping through hoops to do so.

Obviously, it won't happen overnight, so your patience is still needed, but as the announcement says:

"How soon you'll see Access as part of your installation depends upon the Office 365 update channel your admin designated for your subscription. For Access users with Office 365 on the Current Channel, you'll see Access added to your installation in December 2016 or January 2017. For Access users with Office 365 on the Deferred Channel, you'll see Access added to your installation in June 2017. To find out what update channel you're on with Office 365, go to File > Account within one of the desktop applications. Your update channel is listed above the version number."

Those of us in the MVP community have been crowing publicly about the changes we see over in Redmond for a while. Proof of the validity of that optimism is rolling out on a regular schedule these days.

Bookmark Office Support at MS to keep up-to-date with the enhancements coming our way.


Saturday, November 19, 2016

Free Stuff From Grover Park Consulting

Everybody likes Free Stuff.

I have just published one more free application on my website, GPCData Downloads.

This one is an Access Web App (AWA). It's a demo version of the same AWA I use myself.

I track:
  • A1C (Quarterly)
  • Blood Glucose (daily)
  • Blood Pressure (daily)
  • Pulse (daily)
  • Weight (daily)
You can use these same categories, or add your own.

I also track:
  • Daily Exercise by type and location
  • Duration of activities
  • Mileage, when appropriate
In addition, it includes tracking of:
  • Medications
  • Medications Taken
The other function included is tracking of:
  • Calories consumed
  • Carbohydrates consumed
I have also added a basic "dashboard" function to the test tracking, showing maximum and minimum values for each test, along with averages where appropriate. These are color-coded.

Please feel free to download and deploy this AWA to your own Office 365 Site. You can obtain an O365 site for as little as $5.00/month.

BONUS!

Because the data is stored in a SQL Azure database, you can link to it for reporting from Microsoft PowerBI. Create a nice-looking dashboard to share with your own medical team from their browser! 


Monday, November 7, 2016

Microsoft MVP Summit 2016 -- The Future is Brighter Than Ever

I'm at the annual Microsoft MVP Summit in Redmond, getting recharged by the A-Team. We're under NDA for all content; however, it's no secret that the general atmosphere, the mood and especially the outlook for our favorite lovable little Database Application are so different from what we're used to that I can't help smiling all the way through the sessions.

Access is not just alive; it's growing again.

Saturday, October 29, 2016

What People are Downloading From GPCData.com

I recently looked up statistics on which of my free demos and downloads have been most frequently downloaded from the GPC Data Downloads Page over the last ten years.

The winner, by a fairly small margin, was the Access 2007 version of WorkTrack, which is a small tool I originally built (using Access 2003 at the time) to track my own personal project time and billing. When I add in the A2010-A2016 versions and the old A2003 version, it leaps out in front by a significant margin, almost two to one over the next most often downloaded demo, Contact Manager. These versions are in first, third and seventh place overall by number of downloads in the ten years of data available. Contact Manager is in second place overall.

It seems people are interested in simple, easy-to-implement-and-manage project tracking, and also in the ability to track their contacts -- with Access. It does make sense, when you think about it. Anything else we do is fairly custom for the type of business we're in. Most of us, though, want to keep track of who we call, email and meet with, along with the work we do.

So, with that in mind, I recently overhauled the WorkTrack demo. I cleaned up some code, implemented better error handling, and added a Change History function. The Change History function records all new and modified data in all forms, along with the user who made the change and the date and time of the change. It doesn't record deletions, which is in hindsight an oversight. I'll get back to that later.
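If you want to roll something similar yourself, here's a minimal sketch of the idea. It assumes a log table -- called tblChangeHistory here, with TableName, FieldName, OldValue, NewValue, ChangedBy and ChangedOn fields -- and the names are invented for illustration; the version in WorkTrack is more involved than this.

    ' Usage: Call LogChanges(Me, "tblWorkOrders") from a form's BeforeUpdate event.
    Public Sub LogChanges(frm As Access.Form, strTableName As String)
        Dim ctl As Access.Control
        Dim db As DAO.Database
        Dim rs As DAO.Recordset

        Set db = CurrentDb
        Set rs = db.OpenRecordset("tblChangeHistory", dbOpenDynaset, dbAppendOnly)

        For Each ctl In frm.Controls
            ' Only bound data controls carry an OldValue to compare against.
            Select Case ctl.ControlType
                Case acTextBox, acComboBox, acCheckBox
                    If Nz(ctl.ControlSource, "") <> "" Then
                        If Nz(ctl.Value, "") <> Nz(ctl.OldValue, "") Then
                            rs.AddNew
                            rs!TableName = strTableName
                            rs!FieldName = ctl.ControlSource
                            rs!OldValue = Nz(ctl.OldValue, "")
                            rs!NewValue = Nz(ctl.Value, "")
                            rs!ChangedBy = Environ("USERNAME")
                            rs!ChangedOn = Now()
                            rs.Update
                        End If
                    End If
            End Select
        Next ctl

        rs.Close
        Set rs = Nothing
    End Sub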

So, if you haven't visited GPCData.com lately, please come on over and kick the tires.

And please let me know what you think. There are feedback links on most pages.

Sunday, October 23, 2016

Shopping For Shades

The future (of Access) is so bright I'm going to need them.

Although the Access Web App (AWA) has lost a bit of its luster, there are new contenders on the rise. Specifically, Power BI and Power Apps. Or should that be PowerBI and PowerApps? I guess that little detail will be clarified as we move along.

First up, Power BI. I had known about it for a while, but I got a real solid introduction at the recently completed Seattle Give Camp 2016. If there is a Give Camp weekend in a city near you, I'd recommend you get involved. It's a great chance to give back to the community by helping a local non-profit with a software solution. But for me, an even bigger boost is rubbing shoulders with the young, smart, diverse volunteers who show up and put forth an incredible effort.

Harit Kalathia, one of those young, smart developers, provided a Power BI dashboard for our non-profit's Access database. His Power BI dashboard shows all of their Key Performance Indicators (KPIs) on a highly visual, interactive layout. It really opened my eyes as to what can be done with Power BI, but even more than that, how to make it happen. I was hooked.

Sign up for a free Power BI account and go for it yourself.

Being an old-timey Access guy, it took a bit to get used to the new environment, but I found it reasonably easy to figure out, especially since I had Harit's dashboard as a template to guide me. I soon had a two-page, multi-graphic dashboard for one of my own Access Web Apps. That's right. I was able to connect directly to the SQL Azure database behind my Diabetes Tracker AWA and pull the data from it into a series of graphics and tables that nicely summarize the important metrics I track: Fasting Blood Glucose, Blood Pressure and Weight.

You may be amused to note that I track Blood Pressure with both feet flat on the floor and with my legs crossed. I discovered that there can be as much as 10 points difference between the two. Most medical practitioners measure your Blood Pressure with your feet flat. I assume they consider that the more accurate way to measure it. It's part of my obsession with data that I track both, I guess.

More than that, Power BI let me publish the result to a URL that I sent to my Primary Care Physician. He was able to launch the dashboard in his own browser and see up-to-the-minute metrics that he normally measures every three months in his office.

Did I mention? I am hooked on Power BI.

PowerApps is the other new entrant in the arena of web-enabled tools coming online. It's a newer, less mature technology, and there's a long way to go before I'll be ready to call it a reasonable alternative to anything currently possible with Access or Access Web Apps. However, it does solve one big problem with AWAs: the lack of a decent smart-device presentation. I've long been a fan of AWAs. As you might know, I helped write a book on the subject. However, the presentation of any AWA on a smart phone is inadequate for all but the most simplistic of functions. PowerApps is designed out of the box to handle exactly that.

I won't try to promote it until I see enough progress to make that a reasonable thing to do. In the meantime, I urge you to become informed, try it out if possible, and prepare for the future.

And, as ZZ Top would agree, get yourself a decent pair of Cheap Sunglasses.

Saturday, September 17, 2016

A Field By Any Other Name is Just As Good

It's clear that Microsoft has truly renewed their commitment to our lovable little friend, MS Access. They have launched a series of Tutorials aimed at introducing MS Access to a generation that has little or no idea what Access is and how it works.

Outstanding.

The initial tutorial, however, set off a small controversy among some of my peers; some of us felt it was a bit lax in modeling good design practices. After a brief discussion, though, it became clear that the overall goal is more important than some minor points--points that can be debated anyway.

If it brings more developers to the table, and if they taste the flavor of our lovable little friend's power, and go on to adopt it as a tool in their own toolkits, that can only be seen as a Good Thing.

So, despite my initial misgivings, I am sharing the link here. I hope you do as well. The more the merrier.

Wednesday, September 7, 2016

Proof of Life

The A-Team (i.e. the Access development team) at Microsoft have done something unprecedented, I believe.

They've started putting their plans for MS Access on the Office 365 Roadmap. You can read more about it all on Anders Ebro's Blog here: Smiley Coder's Review of the Office 365 Roadmap for ACCESS.

You can find the Roadmap itself here: Roadmap for Future Access Enhancements

Note that the two items mentioned here have been the subject of considerable discussion among the Microsoft Access MVP Group for a long time, and the fact that The A-Team is moving ahead on them sends a powerful message to the many fans of our lovable little database tool.


  • The inclusion of the BigInt data type increases the compatibility between Access and modern versions of SQL Server. That's a good thing from any perspective.
  • Restoring the ability to link Access to .dbf format files addresses a HUGE pain point for a couple of industries. Most of us probably shrugged our shoulders when it was deprecated in Access 2013, but to those people who NEED this ability, it was a deal-breaker.


I couldn't be more delighted to see a public announcement... Well, at least not in the arena of software development.

Tuesday, August 30, 2016

It's Alive!

With a grateful nod towards the late, great Gene Wilder in his classic role in Young Frankenstein, I have the pleasure of noting the stirrings of life in our favorite little database tool, Microsoft Access.

What I hope and expect will be the first release in a trickle--dare we hope a stream or even a wave--is now available.


 

Wednesday, August 17, 2016

Duplicating AutoNumbers — Sort Of

One of the shibboleths of Access design is that Autonumbers make good surrogate Primary Keys because a new one is always generated whenever a new record is started in a table. They are, therefore, guaranteed to be unique, or so goes the belief.

Moreover, because their New Value property is by default set to increment, i.e. to be sequential, they are also counted on to increment in an unbroken pattern, one at a time. That's generally what happens.


Unfortunately, as any long term Access developer can tell you, there isn't any such guarantee in the real world. There are at least two ways in which one or the other of these confidences can be betrayed.
  1. Autonumbers can be generated with gaps in the sequence
  2. Somewhat frighteningly, they can be duplicated.


Gaps in an AutoNumber sequence are familiar to all Access users. They appear either when an existing record is deleted or when a new record is started but abandoned before it is saved. Because Access is designed to generate new, previously unissued values whenever a new record is started, and because it drops those values when the record is later deleted or not saved, any such values are lost, creating gaps in the sequence. Other than the discomfort some users feel when they see such gaps, it's generally a case of "No harm, No foul." Although there's no universal truth here, most developers feel users should never even SEE the AutoNumber Primary Keys in tables, anyway. What they can't see can't hurt them.


The second way is not unfamiliar, although fortunately quite a lot less common: corruption. Corruption occurs for several reasons. Although it is getting a bit long in the tooth, Allen Browne has published one of the most comprehensive discussions of the topic I know about. It's not the main focus of this discussion, so I'll leave it to you to follow up, should you be interested in learning more about corruption of the database type.

Suffice it to say that Access can abandon corrupt records--including their AutoNumber values--when you do a Compact & Repair on that database. Both the record itself and that particular AutoNumber value are lost. The lost AutoNumber value is, in itself, not a big problem. Just as with lost or manually deleted records, the loss of the AutoNumber should result in no harm. The loss of the record, of course, can be very harmful, especially in the absence of a backup of that record.

As an aside, there are documented cases of such corruption being persisted through several generations of backups, so that recovering a good copy is no longer possible. That's another story for another day.


There is still another way, though, in which AutoNumbering can go very, very wrong, and that's really what I mostly want to talk about today.

This problem came to light in a recent incident where a friend asked me to lend a hand in a project. He was leaving on vacation and didn't think he was going to be able to wrap it up in time. 

The database in question was split, but a couple dozen users were all sharing a single mdb Front End. This is as close to a universal truth as we get in the Access world:

Sharing a single FE file across a LAN is bad juju.

Shared front end mdbs and accdb files corrupt, sooner or later. Shared back ends can also corrupt, but fortunately that's less common.

We rescued them, doing a compact and repair on the back end and copying the front end to each user's desktop. That took all morning, walking from workstation to workstation, booting the user off while we copied and set up the file.

Then, we had the owner check. Sure enough, one record was gone. ONLY ONE! Cool beans. All we need to do is reenter that one record and make sure all of its child records are updated to the new Primary Key. Not so fast, bubba. Whoever designed this puppy decided to use this very same AutoNumber PK for their Work Order Numbers. It's even printed on the certification documents they send to customers. Now, that right there is a problem. It's the main reason why experienced Access developers don't show users AutoNumbers for any reason. Just imagine. If that table had a separate field called "WorkOrderNumber", distinct from the AutoNumber Primary Key, it would have been a matter of mere seconds to add a new work order, manually give it the missing Work Order Number and then update those child tables to match. Well, no more than a couple of minutes, anyway. But the original developer didn't do that. He used that doggone AutoNumber.

So what is that message going to sound like? "Um, sorry, but we lost your data and that certificate is no longer viable. Here's a new one. This one is DEFINITELY good. Trust us."

Doable but very, very undesirable.

So, I very cleverly came back the next workday to reinsert that lost record. Here's an interesting, and probably not widely known, fact about Access AutoNumbers: you CAN manually insert one specific value into an AutoNumber field. Of course, you can't duplicate an existing value in a Primary Key field, but it's not a problem to reinsert one that was lost due to deletion or corruption. To quote one of my favorite TV shows from the 60's--Surprise, Surprise. That is not the only problem to look out for, though. Here's what happened, where I went into the ditch, and the simple way I was able to recover, though not without significant loss of face in the process.
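In case you've never tried it, the reinsertion itself is nothing more than an append query that supplies the AutoNumber value explicitly. A minimal sketch, with invented table and field names (the real work order had far more fields than this):

    Public Sub ReinsertLostWorkOrder()
        ' Sketch only: append a row that explicitly supplies the AutoNumber
        ' Primary Key value. Table, field names and values are invented.
        Dim db As DAO.Database
        Set db = CurrentDb
        db.Execute "INSERT INTO tblWorkOrders (WorkOrderID, CustomerID, OrderDate) " & _
                   "VALUES (4321, 17, #12/1/2015#);", dbFailOnError
    End Sub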

When Access looks for the next AutoNumber value to insert into a new record, it DOES NOT USE the largest existing value to calculate the next one. Access DOES USE the most recently entered value to calculate the next one.

Oopsie.

When I reinserted that lost AutoNumber value to restore the Work Order Number, it reset that starting value. Unfortunately for all concerned, a lot of people were already in the database, busily entering new work orders. So, rather than having the owner boot everyone out so I could do a compact and repair on it, I let it go, reasoning, incorrectly, that we'd just pick up where we left off.

The trap was set. There were already several new records with higher AutoNumber values which had been entered between the time I started and the time I finished, no more than 10 or 15 minutes. The next two or three new records were able to reuse incremented AutoNumber values previously lost to deleted or abandoned records, but as soon as Access tried to generate one that WAS in use, it blew up.

Well, not literally, but it might as well have. It simply raised the error that Access couldn't add that new record due to a conflict with an existing Primary Key value. Bad juju.

So, more recovery work, with the corresponding loss of confidence. And this time, when the data was recovered, I DID have the owner boot everyone out and do a Compact & Repair on the Back End. Testing demonstrated that it was now correctly using the last entered value as the seed for the next available value, based not on the reinserted value, but on the last new value generated by entering a new work order (I hope that makes sense). We burned two or three new AutoNumbers testing to be sure, which must cause heartburn for the people who look for gaps in Work Order Numbers on printed certificates, but that one's treatable with antacids, and not a problem for customer relations.

All said and done? I would have been ahead to tell them to recreate that particular work order, with a new Work Order Number and regenerate the certificate to be mailed to their customer.

Either that, or do the job right and finish up every significant database change with a Compact & Repair.

It's just that simple, I guess, when you stop to think about it. Do a Compact & Repair and do not call me in the morning.
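A postscript for the curious: there is also a DDL route to reset the AutoNumber seed explicitly, rather than waiting for a Compact & Repair. It's not what we did that day, and the names below are invented, but the technique looks roughly like this; running it through the ADO connection lets the ANSI-92 COUNTER syntax be accepted.

    Public Sub ResetAutoNumberSeed()
        ' Sketch only: reset the AutoNumber seed to one past the current
        ' maximum value. Table and field names are invented.
        Dim lngNextValue As Long
        lngNextValue = Nz(DMax("WorkOrderID", "tblWorkOrders"), 0) + 1
        CurrentProject.Connection.Execute _
            "ALTER TABLE tblWorkOrders ALTER COLUMN WorkOrderID COUNTER(" & _
            lngNextValue & ", 1);"
    End Sub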



Thursday, August 4, 2016

The Lost Realm of Best Practices

First the good news.


Today's developer community seems to have gotten the bug to search out "Best Practices". Probably they try to implement them, but that's beyond my ability to assess, so let's call that one my "Best Guesstimate on Standard Practices".


Now the bad news.


The Realm of Best Practices is in the same place you will find Nirvana, the Lost Realm of Cardolan, Álfheimr, Asphodel Meadows and a nice little cottage with a small dock by the lake.


It's possibly worth the effort to look for it if you happen to be searching for enlightenment, fighting orcs and demons, or even just waiting out eternity. Just don't give up your day job while searching for it.


Sometimes, just good enough is, well, just good enough. And all we really need to do our jobs well, I submit, is to make things work out as well as we can.


If there's a better way to parse a text string than bog-standard:


 Split([ImportField1], ",")


well, then, so be it. I'm not going to waste my time fretting about it. If you want to share YOUR version, I'll listen attentively and maybe even adopt it next time.


If I can get the same results with an In() clause as I can with Or'ed parameters, I'm not going to stop and ask BingGoogle for advice. Of course, with that one, it's probably not quite so simple as all that. Sometimes an In() clause isn't very efficient. On the other hand, if my code with an In() clause works, doesn't drag like a flat tire on an overloaded hay wagon, and gets the right results, I'm good enough with that. I have a handy alternative or two in my tool kit, if need be. I'm just not going to burn my time and my client's money looking for the magic sauce of a Best Practice. Note I said "magic" not "secret". And yes, that's about how I see it. Your metaphor may vary.
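Case in point, the kind of quick sanity check I might run once and then forget about. This is a sketch with invented table and field names; both counts come back the same, which is good enough for me.

    Public Sub CompareInAndOr()
        ' Sketch only: the In() clause and the Or'ed criteria return the
        ' same rows. Table and field names are invented.
        Debug.Print DCount("*", "tblOrders", _
            "Region In ('North','South','West')")
        Debug.Print DCount("*", "tblOrders", _
            "Region = 'North' Or Region = 'South' Or Region = 'West'")
    End Sub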

Sunday, June 26, 2016

Does Anybody Really Know What Time It Is?

Yes, me. I do. I know what time it is.

First, though, you have to tell me where you are and what the date is. Then I can tell you what the local time is where you are.

Simple, eh?

Yes, and No. Obviously, we do this all the time without really thinking about it all that much. If it's 10:02 PM in Seattle in June, then it must be 1:02 AM the next day in New York.

With Access Web Apps, unfortunately, it's not that simple. When Access Web Apps were launched, they weren't fully gestated. Lots of good things went into AWAs, but a lot of important stuff got left out.

More importantly, the basic technology behind the AWA made it impossible to NATIVELY figure out what time it is within the AWA itself. We're talking about server technology which runs in a browser. As far as the AWA is concerned, the current time is the time on the database server where the SQL Azure--or SQL Server--database resides. If the server is somewhere in central Washington state, then Now() returns the current local time in the Pacific time zone. Washington state observes Daylight Saving time. If you ask for Now() in July, you get PDT, but if you ask for Now() in December, you get PST.

Let's say, for example, you happen to be in Arizona when you log in to this AWA. Arizona is one of a handful of states that don't observe Daylight saving time, so it'll be one hour ahead of PST, but it'll be the same hour as PDT.

And, of course, if you happen to be in New Jersey, Now() on this same database server will tell you it's three hours earlier than it really is, because of where your browser is physically located in relation to the database server.

It's enough to make a serious developer walk away from AWAs. I've actually talked to other Access developers who did just that.

Me? I'm stubborn. Besides, I figured, how hard could it be?

Not all that hard, as it turned out.

You can quite readily calculate the current local time for any location in the U. S. If you work at it, you can expand that to other countries as well.

Here's how I did it in one AWA to show how it works. It's based on a list of addresses for various locations throughout the United States. I started with family addresses, but the demo is now a generic set of locations.

First, I imported a new table of ZIP Codes which also includes the UTC offset for that ZIP code and whether that ZIP Code is in a state that observes Daylight saving time. I got that off the internet.

A word of caution. I am well aware that there is no guarantee this approach will handle every possible case, even within the U. S., where each ZIP code does fall entirely within a state. I am aware of one case, out in West Texas, where a ZIP code might be in two different time zones. There are two West Texas counties in the Mountain Time zone, while the remainder of the state is in the Central Time Zone. Therefore, it depends on whether the relevant ZIP codes in those counties fall entirely within their respective county lines. I don't know for sure, but I'm sure local residents do. I'll leave it up to them to work out the details. With that caveat, then, the table provides two of the three pieces of data we need to do the time zone offset calculation.

Next, I created a table of the dates and times when Daylight saving time goes into effect each year, and when it ends each year. This allows me to compare the current date to the date when DST changes and decide whether to apply it or not. This table will need to be maintained for future years, of course. I didn't have sufficient confidence it will be constant for the foreseeable future, although we should be able to count on the next three or four years.

The next step was to update an app table to set flags for this calculation. My app tables have other flags as needed in other operations. It's a handy way to persist such things, but also make them updateable when needed.

Then, I created a Data Macro to perform the DST calculation.

It runs in the On Start Macro. It sets a flag  in the app table to indicate whether DST is in effect or not. Later calculations refer to that flag field in the app table.

A second caution. Note that this flag field is the same for everyone using the AWA. If one user logs in, they set the flag. If someone else logs in a little later, they reset it. Except for the wee morning hours on the days when DST changes, though, this really shouldn't matter. And even then, it's not likely to present a serious hurdle because what matters is that the CURRENT local time is calculated as needed.

Next, I created a set of queries to combine the local addresses in the database with the Time zone offsets in the ZIP Code table and the DST In Effect flag in the app table. The queries report current local time at each address, or for all of the ZIP codes in the ZIP code table.

A set of Data Macros calculates the current time from:

a) The time zone offset reported by the app. This is, as previously indicated, determined by the location of the database server hosting the AWA.

b) The time zone offset for the zip code at each address in the table of addresses stored in the AWA.

c) The DST In Effect flag set when the AWA first starts.

The output of the Data Macros can be used to display the current local time next to an address on a view.
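If the moving parts are hard to picture, here is the same arithmetic written as a plain desktop VBA function. The AWA itself does this with data macro expressions, not VBA; the parameter names are invented, offsets are assumed to be whole hours relative to UTC, the DST adjustment is assumed to be one hour, and the server is assumed to sit in a location that observes DST (as mine does).

    Public Function LocalTimeAtZip(dtServerNow As Date, _
                                   intServerUtcOffset As Integer, _
                                   intZipUtcOffset As Integer, _
                                   booZipObservesDst As Boolean, _
                                   booDstInEffect As Boolean) As Date
        ' Sketch only: reproduce the AWA's data macro arithmetic in VBA.
        ' Offsets are whole hours relative to UTC (Pacific = -8, Eastern = -5).
        Dim dtUtc As Date
        ' Remove the server's own offset (plus DST, since the server's state
        ' observes it) to get back to UTC...
        dtUtc = DateAdd("h", -(intServerUtcOffset + IIf(booDstInEffect, 1, 0)), dtServerNow)
        ' ...then apply the target ZIP code's offset, plus an hour if that
        ' ZIP observes DST and DST is currently in effect.
        LocalTimeAtZip = DateAdd("h", intZipUtcOffset + _
            IIf(booZipObservesDst And booDstInEffect, 1, 0), dtUtc)
    End Function

For example, LocalTimeAtZip(Now(), -8, -5, True, True) would return Eastern time for a server sitting in the Pacific time zone during the summer.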

It's actually somewhat simpler than it might sound. Please go to my website and download the demo if you want to install it on your own Office 365 site, or on your own on-premises SharePoint site. I'd really appreciate your feedback on  it.


Friday, June 24, 2016

Just Say "No-Rmalization"


It never ceases to amaze me.


The other day, we got a question over at UtterAccess about adding a Total column to an existing query.

In chasing that one down the rabbit hole, we finally learned that there were not one, but two, different tables involved, one for each Fiscal Year on which reporting was required. That's a basic table design problem right there, but it got worse. The earlier table, named FY16, had three fields: Department, Month and "Amount". (I never did learn what that Amount represented, but I suspect a Budget, since I've seen and built FY Budget tables before.) The later table had 14 fields: Department, "Amount", "January", "February", and so on.


Altogether now, a collective groan so we can move on.


Okay, feel better?


So the original question was how to add that "Totals" column to a query representing the monthly amounts by department. We also didn't get an explanation of how they intended to combine the two FY tables, but, hey, that's the next step.


I pointed out, of course, that this represents two different design flaws, and that getting the required "totals" would be duck soup, or the equivalent in chickens, if the underlying table flaws were fixed.
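For what it's worth, here's roughly what the fix looks like -- a sketch with invented names, since I never saw the real tables. One narrow table holds both fiscal years, and the "Totals" the poster wanted falls out of a one-line GROUP BY.

    Public Sub BuildNormalizedBudgetTable()
        ' Sketch only: one row per department, per fiscal year, per month --
        ' no repeating month columns and no separate table per fiscal year.
        ' Table and field names are invented.
        Dim db As DAO.Database
        Set db = CurrentDb

        db.Execute "CREATE TABLE tblDeptBudgets (" & _
                   "BudgetID AUTOINCREMENT CONSTRAINT pkDeptBudgets PRIMARY KEY, " & _
                   "Department TEXT(50), " & _
                   "FiscalYear INTEGER, " & _
                   "BudgetMonth INTEGER, " & _
                   "Amount CURRENCY);", dbFailOnError

        ' The requested Totals column becomes a simple aggregate query.
        db.CreateQueryDef "qryBudgetTotals", _
            "SELECT Department, FiscalYear, Sum(Amount) AS TotalAmount " & _
            "FROM tblDeptBudgets GROUP BY Department, FiscalYear;"
    End Sub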

I also refrained from preaching the gospel of normalization. I've come to the conclusion that too many people don't want to hear it. It DOES sound preachy to say "You've violated two of the Rules of Normalization. Repent!"


No, I try to point out a "more flexible, expandable way" to design and build tables and hope they take it onboard.


Sooner or later, they get it or they give up. Either way, the world is a happier place, IMNSHO.


Wednesday, June 15, 2016

What Walks Like a Duck?

 
 I love MS Access. It's been very good to me. I’ve made a living using (primarily) Access for more than a dozen years. There’s one thing about it, though, that keeps me up at night. It’s too flexible.
 
You know the old saying, “If it walks like a duck and quacks like a duck, it’s a duck”. Well, that’s fine if you have a flock of ducks and you’re trying to pick out one chicken among them. Unfortunately, “an Access database” isn’t a Duck, or a Chicken. Or a Goose, Turkey, Partridge, Sage Grouse, Pheasant, Guinea fowl, Cornish hen or Dove. It’s all of those and much, much more.
 
My point? Oh, yeah. Sorry I got a bit hungry and went for a snack. I’m back and ready to state the point.
MS Access is a Relational Database Management System (RDBMS)*. Access is an interface design tool. Access is a Rapid Application Development (RAD) tool. Access is all of those and much, much more. It’s a Data Aggregation Tool; it’s an Extract, Transform and Load (ETL) tool; it’s a prototyping tool. It’s even got secret sauce and a close, personal relationship (relational pun intended) with its big brother, SQL Server, and its other brother, Excel. Oops, got carried away there. There’s no secret sauce.
MS Access has the ACE data engine. It’s a remarkable engine, the little engine that could. You can abuse it, confuse it, rearrange it and extend it. One thing for sure is that, if you can dream it, you can probably build it with ACE. The ACE data engine is almost too flexible and too easily manipulated. And that’s a big problem, from time to time, especially for novices.

Unlike Ray Kinsella in the Field of Dreams, Eli Whitney or Henry Ford, most new Access users come to the task with no background in the field, no experience designing and building a prototype, or a vision and burning passion to realize their dreams to make the world a better place through the power of their creation. They just have a job to do, and most often, little time to do it.
That means, unfortunately, they all too often don’t have an inclination to invest the time and effort into figuring out how to use the ACE data engine most effectively. It’s sort of like Ray Kinsella setting out to clear his corn field with a garden hoe and rake, or Henry Ford trying to build cars on an assembly line with no way to move the vehicles along except teams of horses hitched to each one. Possible? Sure. Practical? Not really.

While I do applaud the creative, inventive use of Access, I have to wonder if enough novices really grasp the significance of learning how to do it right. And by right, I mean the five Rules of Normalization. Actually, there are many, many discussions of normalization floating around on the internet. If you want to know more, just ask Bingoogle. They both know where to look.
I guess, to sum it up, I’m trying to say that all ducks are not born equal. Some are Chickens, some Geese, some Pheasants, and a few I’ve seen have been real Turkeys. Applying the rules of normalization is the best way to avoid those turkeys.
 
 
 
*I’ll get some argument on this point. Access ACE is not a full-fledged RDBMS and was never intended to be. I know, I’m glossing over that fact on my way to a larger point. Take it up with me in an email; I’ll be happy to hear your opinions on the issue. I'll probably share them too. Depends on how many nasty words you use--or don't use, if you get my meaning.

Wednesday, June 8, 2016

Back to Earth for a Moment.

Recently, I’ve spent a lot of time thinking about, and working with, Access Web Apps, which is Microsoft’s tentative venture into making MS Access databases work “in the cloud”. I know, pretty sexy, huh? The Cloud: that’s the ticket. (Bow in the general direction of Jon Lovitz.)
 
Well, it’s not all that.
AWAs followed on from the idea that MS Access is the best small database solution available for Windows, and has been for over 20 years. So, moving it to the cloud was a no-brainer, at least in the beginning, 6 years ago. Microsoft’s Access web database offerings turned out to be less than we hoped they would be, and still could be. We’re not done yet, so there’s still hope, but that hope has definitely faded with each passing month. However, that’s only part of my point. The rest of my thoughts have to do with how we think about, and how we approach, what we do for our clients. We’re Access developers. What we do is create Access database solutions, right? Well, not so fast.
 
While we may be tempted to think in terms of "a database solution", what we are really after, most of the time, is a "data management solution". Clients come looking for our help to solve problems within their organizations which usually, but not always, include managing some of their most important business-related data. They need to make processes more efficient, minimize transaction time and costs, and produce more actionable reports. Those are the things we must put front and center, not the specific tools we use to solve those problems. That means the specific tools we choose should not be the determining factor in how we do the job.
 
I'd say at least two other factors must be given higher priority than our choice of a specific tool:
 
Simplicity. A variation of Occam's Razor can be applied. When there are competing ways to get the job done, it's usually best to choose the simplest one. Not always, of course. There may be other complications which require more elaborate solutions. However, the end user's needs should be met with the least complex, but still effective, solution.
 
Cost. We must never lose sight of the fact that we, as developers, are expending SOMEONE's resources no matter what solution we implement. If a sexy solution offers us the chance to show off our chops, and it costs no more than a mundane solution that would get the same result, okay fine I say, show off. You don’t get that many chances to dance in front of an audience. Just remember, the person with the checkbook is paying for results, not a flamboyant Flamenco.
 
On the other hand, it's not likely to be in the client's best interest to invest two or three times the amount in a feel-good solution using the latest bleeding edge technology just because it can be done that way. Why go to the cloud? Is that what meets the minimum acceptable requirements of the project? If so, then do it. If not, resist the urge to try out something new, just so you can add it to your resume. I have yet to meet the client who told me, “Make sure you have a good time working on my project. I don’t really mind what it costs.”
 
I have had one client tell me, on the other hand, “If what you want to do will save me money, or make me more money, go ahead and do it.”
 
That’s been my motto for more than a dozen years now. I approach every new project with the same thing in mind. You should too.

Wednesday, April 27, 2016

Office 365 Plans For Access Web Apps

If you want to create an Access Web App, or AWA, you need two components:
 
1) The MS Access Application, version 2013 or 2016
2) An Office 365 Account which includes Access Services
 
Of course, if you work in a large organization which has deployed SharePoint 2013 or 2016 internally, you can use the Access Services provisioned on it (assuming that has been done). Many of us don't have that option. We can, however, get a cheap O365 account for our purposes. And that’s what I am going to talk about today.
 
Obviously, to create an Access database or web app, you need MS Access. I think most people get that point without a lot of effort. Further, it’s pretty easy to understand that you need either the 2013 or 2016 version to support AWAs. We’re used to versioning in software applications.
 
It's the second component that gets murky: which Office 365 account do you need, and why?
 

Business Plans

Here’s the page for Office 365 Business Plans, Microsoft’s website listing the basic plan options for business. It used to be called “Small Business Plans”; it’s now just “Business Plans”, as opposed to "Enterprise Plans" or "Personal Plans".

Enterprise Plans

There’s also a page for Office 365 Enterprise Plans. I won’t go there today, but if you need the services and products offered in any of those plans, the higher costs for them are worthwhile.
 

Detailed Comparison of Plans

 Here's a page with detailed comparisons of the plans.
 

What Do You Get in an Office 365 Plan?

So, the first thing we see for Office 365 Business Plans is that two of the three plans include the “Standard” Office applications, which means Word, Excel, PowerPoint, Outlook and OneNote. None of them includes the Access application itself. These will be the 2016 versions of those applications; I believe you can opt for the 2013 versions, though I don’t understand why anyone would do that.
 
The absence of MS Access means, however, that you have to obtain a license for MS Access, the application, elsewhere. Regardless of which Business Plan you choose, you don’t get the Access application.

 

Getting Access Services

 
The other component we’re looking for, Access Services, on the other hand, IS included in all three of these plans.
 
Good News! If you have a licensed copy of MS Access 2013 or MS Access 2016, all you need is the lowest-cost plan, Office 365 Business Essentials, at $6.00 per month, or $5.00 per month if you buy an annual license.
 
This is important information for anyone looking to move their Access databases “into the cloud”. For $60.00 a year (billed annually) or $72.00 a year (billed monthly), you can have any number of Access Web Apps on the web. In my opinion, that is a very good deal.
Pass it on.
Thanks.
 
 

Wednesday, April 20, 2016

Where Do YOU Shop for Furniture?


The other day I was reading posts on www.UtterAccess and thinking about what kinds of questions people ask. It occurred to me that some people look for database design assistance as if they are Ikea shoppers, some like Ethan Allen shoppers, and some like Home Depot shoppers.

The Ikea shoppers always want a database template. They’re willing to do a minor amount of assembly, but they crave pre-packaged solutions that someone else put all the hard work into creating, packaging and delivering.

The Ethan Allen shoppers want a high-end, finished database that they can have delivered to their home or office. They’re not interested in learning much about it, only that it is delivered with a minimum of hassle on their part. They’re willing to pay the cost which goes with that product.

The Home Depot shoppers are looking for an associate in the warehouse store who can help them pick out materials, along with a few tools, and maybe conduct a training session on Saturday morning, but they expect to do the hard work themselves. They know the final product is going to be a bit rough around the edges, but that’s a good trade-off for saving a lot of money.

One problem, of course, is when shoppers expect to pay Ikea or Home Depot prices for Ethan Allen products. That’s a bit frustrating, as a matter of fact. I can’t say that I really blame them, I suppose. If you could get a $3,000 sofa for $198, wouldn’t you take it?

Another problem is when shoppers want to pick up an Ikea bookcase that exactly fits that odd-shaped corner of their living room. Well, chances are high that anything you get from Ikea is going to have standard dimensions, aimed at 99% of their customers, and not at your odd-ball space. If you put your mind to it, one of those out-of-the-box bookshelves can be modified to work for you. You just need to be willing to put in the work to do it.

And the third problem is that Home Depot shoppers tend to run into frequent difficulties that keep them returning to the store for more parts, more tools and more tutorials. There’s a lot more hand-holding involved.

Actually, now that I think about it, none of these are really problems, with a capital P. They’re just different ways to think about what it means to be a support person in the wonderful world of online forums.

 

 

Friday, April 8, 2016

What Are People Looking For?

My website, www.gpcdata.com, has been active for years. I offer a lot of free, fully functional, sample Access databases and code examples. Recently, I was reviewing statistics on what people look for when finding me, and what they download when they get there.
 
As you'd guess, "Free Access Database" is a popular search term, usually accompanied by a specific version or category. Student databases seem to be popular; there are lots of school staff and teachers out there looking for a way to track their classes or schools.
 
However, the most popular download, by a narrow margin, is "contacts". The version I have available is mostly aimed at tracking simple, client/contact related information, along with meetings and phone calls with those contacts.
 
Currently, there are two versions. The newest one is an accdb designed with Access 2013. It ought to run in Access 2010 and even 2007, although I haven't checked the latter recently. The other is the venerable mdb originally designed with Access 2003. I debated pulling the mdb out of the download, but after some recent conversations on www.UtterAccess.com, I decided to let it ride. Lots of people seem to be sticking with the older versions still, even Access 97, despite its being two decades old.
 
In second place is an older version of my project and work tracking tool. I actually built it for my own use and have made many modifications to it over the years. The version I use every day is now modified to run under Access 2013/2016, with a custom Ribbon. It connects to SQL Azure tables now. That allows me to carry a copy of the Front End on my laptop and update work hours from any location without having to worry about resynching Access BEs.
 
Actually, there are three versions of the work tracking tool: one for Access 2007, one for 2010 and newer, and one for 2003 (the mdb format). Interestingly enough, the 2007 version has proven slightly more popular over the years and still is in the most recent reporting period.
 
I'm mulling over the implications of this history of contacts and work tracking.
 
What do you think?

Wednesday, March 30, 2016

One Size Fits All, Or All For One. NOT!


One of the most valuable aspects of Access Database development is that it is almost entirely custom.
That means nearly every new Access application is different in some non-trivial way from every one that preceded it--ever.

Need a database to track family addresses and phone numbers? Great. You'll find templates all over for that. But no two will be exactly alike and that's a good thing, IMO.

However, for a lot of new Access developers, that turns out to be a stumbling block to getting off the ground. If all you want to do is copy out a template and go to work, it's almost never possible to do so. Naming conventions, at the very least, are those chosen by the template designer. They may or may not be the same as those used in your organization.

And that’s just the trivial part.

You want to track one or more phone numbers for each contact. No problem, that’s a job for a related table of Phone Numbers. But what if the template maker decided every contact can have three Phone Numbers (Home, Work and Mobile)? That’s a limitation you’ll have to accept as a compromise. Or, you can modify the template.
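For the curious, the related-table version is a small thing to build. Here's a sketch with invented names: one child row per phone number, each tagged with its type, so a contact can have as many or as few numbers as needed.

    Public Sub BuildPhoneNumbersTable()
        ' Sketch only: a child table related to tblContacts, holding any
        ' number of phone numbers per contact. Names are invented.
        Dim db As DAO.Database
        Set db = CurrentDb
        db.Execute "CREATE TABLE tblPhoneNumbers (" & _
                   "PhoneID AUTOINCREMENT CONSTRAINT pkPhoneNumbers PRIMARY KEY, " & _
                   "ContactID LONG CONSTRAINT fkPhoneContact REFERENCES tblContacts (ContactID), " & _
                   "PhoneType TEXT(20), " & _
                   "PhoneNumber TEXT(30));", dbFailOnError
    End Sub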

At some point, such trade-offs and compromises become more of a hindrance to getting on with the task than simply starting out from scratch with your own design. And the more complex the business process, the more likely it will be that any template you find won’t stretch to fit it.

All of that can be summed up in the saying which is the title of this post. One Size Does Not Fit All.

You’ve been warned.

Wednesday, March 16, 2016

It's all about the data.


That's so obvious it seems silly to even comment on it in a page that's all about database design, doesn't it? But, every day, while addressing Access questions on my favorite Access forum, Utter Access, I run across questions that can only be answered by looking at the specific data involved, not at the code used to manipulate it.

A recent example might help explain what I’m talking about. I’ve rephrased the question so as to avoid making it too easy to identify the source.

“My query raises a division by zero error. The query includes two calculated fields. The data is from three subqueries. The SQL for the final is shown below. How do I avoid the division by zero issue?”
Not picking on anyone, but the basic mathematician in me says “this ain’t a database question, it’s a math question. You avoid division by zero by not including zeros in the divisor.” But that simply hadn’t occurred to the questioner, I guess. He or she was looking for “an Access solution” to the math problem.

There are, of course, two answers to this.

First, if records in the underlying tables have either Nulls or zeroes in one of the fields going into the calculation, then those values have to be resolved before you even START writing queries against that data. Exclude those records from the selection before you try to do any math on the remaining records.

There is, also in that sense, “an Access solution” to the question, "How do you handle Nulls and Zeroes in SQL so that they don’t blow up calculations?" The answers, of course, involve functions to convert Nulls and zeroes, as needed, when they appear. But that’s not the point of today’s comments.
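(For the record, since someone will ask, a minimal sketch of that kind of guard follows. Whether you hand back Null, zero, or something else entirely is a business decision, not an Access one.)

    Public Function SafeDivide(varNumerator As Variant, varDenominator As Variant) As Variant
        ' Sketch only: convert Nulls with Nz() and sidestep division by zero
        ' by returning Null instead of raising an error. In a query, the same
        ' idea can be written inline, along the lines of:
        '   IIf(Nz([Denominator], 0) = 0, Null, [Numerator] / [Denominator])
        If Nz(varDenominator, 0) = 0 Then
            SafeDivide = Null
        Else
            SafeDivide = Nz(varNumerator, 0) / varDenominator
        End If
    End Function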

Second, there is an even earlier, more fundamental, question to be addressed. If you are doing math (and a division by zero  error definitely signals that math is being attempted), then you have to decide whether Null is valid for the data at hand, and whether Zero is valid for the data at hand. If so, then why do you then want to do math on either of those values, knowing that your calculation is not valid under any circumstances?

For example, if you want to calculate the average number of days between the date an order is placed and the date it is shipped, you have to decide, right at the beginning, if you want to try to include orders without a ship date. Those orders are still being processed and not yet shipped. I submit the answer to that one is obvious. You must exclude them because it’s only meaningful to ask about average processing days for orders that were actually processed and shipped. So, by the time you write that SQL with a division in it, there are no nulls to fret over.
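In query terms, that decision looks something like this (a sketch; the table and field names are invented):

    Public Sub AverageProcessingDays()
        ' Sketch only: unshipped orders are excluded before any math happens,
        ' so there are no Nulls left to trip over. Names are invented.
        Dim rs As DAO.Recordset
        Set rs = CurrentDb.OpenRecordset( _
            "SELECT Avg(DateDiff('d', OrderDate, ShipDate)) AS AvgDays " & _
            "FROM tblOrders WHERE ShipDate Is Not Null;")
        Debug.Print "Average processing days: " & Nz(rs!AvgDays, 0)
        rs.Close
    End Sub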

Another example would have to do with calculating error rates in a manufacturing process. Let’s say you want to calculate a percentage of errors detected in Quality Control versus errors reported by customers after products are delivered. Unfortunately, if the calculation is QA Errors divided by Customer Reported Errors, it’s entirely possible (and one would have to hope, likely) to have one or more errors found in QA and Zero reported by customers for a particular product. See what’s going to happen there? Yup, a divide by Zero error.

The answer to that one is equally obvious, I think. You don’t do the arithmetic that way in the first place. You come up with a more appropriate, mathematically valid, way to calculate this metric. And how you do that is as much a business rule as it is a math problem. Maybe the next approach might be, well, okay, let’s SUM the QA and Customer Reported errors first and then divide the QA errors by that total to get a ratio of QA errors. Uh uh! If there are no errors of either sort, then you’re still dividing by Zero.

At this point, I will step in and acknowledge that there is an Access way to handle it, but you can only get there by understanding the math—and the business rule—behind it.

It's all about understanding your data first. After that, it's all just code.