A few weeks ago I had a chance to meet with my good friends at Lambda Test Equipment in Pretoria, South Africa. Lambda is an award-winning organization that specializes in test equipment to support fiber optic networks. They have a solid reputation and a loyal customer base that they've been serving for over two decades. And, of course, they represent OSPInsight.
Along with various customer visits around Pretoria and Johannesburg, I had the opportunity to spend a full morning with many Lambda clients, discussing the importance of implementing software solutions to manage their fiber optic networks. There is a point in such presentations, after explaining the benefits of properly documenting a fiber optic network, when I pause and say something like, "Once you do all of this, you will have wasted your money." After I say that, I usually let the words float in silence while the participants stare at me like I just spewed some "alternative fact". And then, I explain.
Investing in a solution to support the management of your fiber optic network is more than just buying software; it requires a commitment to migrate legacy data and maintain the database going forward. If you can’t make that commitment, YOU ARE WASTING YOUR MONEY!
Think of the fiber optic network management system as if it were a beautiful garden. Imagine a garden where you've spent hours, days, and weeks weeding, planting, watering, and pruning. Imagine how you'd feel on warm summer nights and early summer mornings enjoying the beauty and serenity of that garden. Imagine savoring the sweet taste of fruits and vegetables grown in the garden. It would be vibrant with bright colors and fresh fragrances. Then, imagine what it would look like if you went on vacation for a few months without leaving anyone to care for it. The plants would wither without water under the scalding sun. Weeds would engulf the plants and spread like a virus. The garden would become an eyesore, a dusty plot of disarray with all of the effort of creation lost and forgotten.
Just like the garden, every database begins its life with hope and aspiration. Energy and resources are dedicated to adding data to the system. As that happens, the data begins to blossom across a network map, appearing on computer screens and mobile devices. Entities become enriched with newly available reporting capabilities. Field technicians are empowered with tools to dig deep into the rich supply of information to plan and problem-solve. But if the physical network grows and changes while the database does not, the integrity of the data will be compromised. This can breed user skepticism, which may trigger a death spiral: the less the data is trusted, the less it is used, and the less it is used, the less it is trusted. Ultimately the data becomes irrelevant and worthless: a failed project and a wasted investment.
Since creating and maintaining our very first fiber optic network database in 1996, and hundreds more since then, we and our clients have experienced just about everything when it comes to maintaining these databases. There have been successes and failures, and each has taught us valuable lessons. Based on those lessons, I've summarized below seven essential practices that have proven to help attain and maintain a vibrant fiber optic network database.
1. Know where changes to the physical network are going to be made.
There are various approaches to tracking where physical work is being done on the network, but the easiest to implement is to use the network map in the fiber optic network documentation system to visualize that work. For example, place a marker at each location on the network map where work is planned or being performed. Then, as the as-built drawings come in for those projects, change the marker to indicate that the data has been received and input. This gives you a quick visual of where to expect database changes.
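The marker lifecycle described above can be sketched in code. What follows is a minimal illustration only, not part of any OSPInsight API; the `WorkMarker` structure and status names are my own assumptions for the example.

```python
from dataclasses import dataclass
from enum import Enum

class MarkerStatus(Enum):
    """Hypothetical stages a map marker passes through."""
    PLANNED = "planned"                      # work is scheduled here
    IN_PROGRESS = "in progress"              # crews are on site
    AS_BUILT_RECEIVED = "as-built received"  # drawings are in hand
    INPUT = "input"                          # database has been updated

@dataclass
class WorkMarker:
    project: str
    latitude: float
    longitude: float
    status: MarkerStatus = MarkerStatus.PLANNED

    def advance(self) -> None:
        """Move the marker to the next stage; stops at the last one."""
        stages = list(MarkerStatus)
        idx = stages.index(self.status)
        if idx < len(stages) - 1:
            self.status = stages[idx + 1]

def pending_updates(markers):
    """Locations where database changes are still expected."""
    return [m for m in markers if m.status is not MarkerStatus.INPUT]
```

A quick scan of `pending_updates` then answers the question the paragraph poses: where should we still expect database changes?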
If you use OSPInsight PET to design and manage your projects, you will always know where the work is being done and what is being done. You will also know where work is planned to be done, where it has been done, who did it, and for what cost. Learn more about OSPInsight PET at ospinsightpet.com.
2. Establish an internal owner of the database who has authority to demand updates from field crews and other personnel.
If you are lucky enough to have someone who is passionate about keeping the database updated, put that person in charge. If you don't, incentivize someone to take ownership. We have found that it is better to have a single person responsible for managing the process. For small networks, this person may also be the one doing the revisions; for large networks, this person may manage a team of editors. With one point of entry, the data is less likely to get lost in the shuffle. Furthermore, this person should be empowered to demand as-built updates if they are not arriving from the field in a timely manner.
3. Thoroughly train editors of the database to understand exactly how to do their job.
Our philosophy is the fewer editors the better: fewer people touching the data means fewer potential issues. Even so, we work with companies that have scores of editors because their data turnover is so massive. However many editors there are, each needs to be highly trained in the specific tasks they're assigned. Everyone who has the ability to add, modify, or delete data, no matter how small the job, should understand the tremendous responsibility and privilege they have. It is much easier to do it right the first time than to fix errors later.
With OSPInsight, we have invested in online training courses for our products so they are available 24/7/365. This makes it easier for editors to get the training they need, when they need it.
4. After the data is input, use sample testing to confirm its accuracy.
If a building being added to the database contains a patch panel with a terminated fiber cable running into the outside plant, it is always best to test the ports on that panel to make sure the related optical circuits take the correct path. In OSPInsight, simply run a taper report on that entry cable to verify that the fibers reach the correct end points. However, you can't always reasonably check 100% of the data being input. Test smart. Sample testing is a good way to help ensure data accuracy when full testing is not possible. Depending on the data being input, determine an appropriate sample size and test accordingly. If errors are found, increase the sample size until the errors are eliminated; then reduce the sample size and continue.
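The escalating sample-size rule can be sketched as a short routine. This is one assumed way to implement it, not an OSPInsight feature; here `check` stands in for whatever field verification you actually perform (such as running a taper report and comparing end points).

```python
import random

def sample_test(records, check, sample_size):
    """Run `check` on a random sample of records; return those that fail."""
    size = min(sample_size, len(records))
    return [r for r in random.sample(records, size) if not check(r)]

def escalating_test(records, check, start_size):
    """Apply the escalation rule: if a sample turns up errors, double the
    sample size and retest (in practice the failures would be corrected
    between rounds). Stops when a sample comes back clean or the whole
    set has been checked, and returns every failure encountered."""
    size, seen = start_size, []
    while True:
        failures = sample_test(records, check, size)
        seen.extend(failures)
        if not failures or size >= len(records):
            return seen
        size = min(size * 2, len(records))
```

The doubling factor and stopping condition are arbitrary choices for the sketch; the point is the shape of the process, not the exact numbers.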
5. Provide regular reports to upper management, focusing on metrics of interest that reference the fiber optic network database.
If the contributors to the database know that upper management is using the data they are creating, they will be more sensitive to ensuring the accuracy and promptness of the data entry. For example, if upper management is using a fiber capacity report on a weekly basis to determine the health of the network, they will notice changes to the numbers they evaluate. Imagine if the report indicated there was 75% fiber availability in a given area and then the following week that number went to 50%. That would spawn questions which would require answers. In this way, a feedback loop for the data naturally occurs and has a positive effect on the veracity of the data.
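The week-over-week comparison in that example can be automated with a few lines. This is a hypothetical sketch of the arithmetic, not OSPInsight Reports behavior; the ten-point threshold and the area-to-percentage layout are assumptions made for illustration.

```python
def fiber_availability(total_strands, strands_in_use):
    """Percent of strands still available in an area."""
    return 100.0 * (total_strands - strands_in_use) / total_strands

def flag_changes(last_week, this_week, threshold=10.0):
    """Return areas whose availability moved more than `threshold`
    percentage points week over week, i.e. the kind of 75% -> 50%
    swing that should spawn questions from management."""
    return {
        area: (last_week[area], pct)
        for area, pct in this_week.items()
        if area in last_week and abs(pct - last_week[area]) > threshold
    }
```

A report built this way surfaces exactly the anomalies that drive the feedback loop described above.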
We have created our OSPInsight Reports tool to provide various reports to facilitate this need. Learn more about OSPInsight Reports.
6. Monitor the performance and integrity of the data consistently and often.
Software development teams work tirelessly to find and fix every possible way that data could be adversely impacted by users. But there always seem to be scenarios that get missed. Thus, it is important to have tools that monitor the mechanics of the data, searching for issues that affect its integrity. Such tools become a safety net of sorts for software updates and new users. Without an integrity report, there is always an issue of trust with the data.
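To illustrate the kind of mechanical check such a tool performs, here is a toy example: verifying that every splice references a cable that exists and a fiber number that cable actually has. The record layout is a deliberate simplification invented for this sketch, not OSPInsight's actual schema.

```python
def run_integrity_checks(db):
    """db is a simplified store: {'cables': {id: {'fiber_count': n}},
    'splices': [{'cable_id': id, 'fiber': k}, ...]}.
    Returns a list of human-readable integrity issues."""
    issues = []
    for splice in db["splices"]:
        cable = db["cables"].get(splice["cable_id"])
        if cable is None:
            # Orphan record: the splice points at a cable that was deleted
            # or never entered.
            issues.append(f"splice references missing cable {splice['cable_id']}")
        elif not 1 <= splice["fiber"] <= cable["fiber_count"]:
            # Out-of-range reference: fiber number exceeds the cable's count.
            issues.append(
                f"splice on cable {splice['cable_id']} uses fiber "
                f"{splice['fiber']} but the cable only has "
                f"{cable['fiber_count']} fibers"
            )
    return issues
```

Run regularly, a report like this catches mechanical damage early, before it erodes user trust in the data.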
Our OSPInsight Integrity application is designed specifically to provide this feedback. Learn more about OSPInsight Integrity.
7. Train and encourage your internal teams to embrace the system and rely on its data.
The healthiest databases are those that are being used. When the data is exercised, it becomes self-validated. An integrity tool, as described above, can only look for mechanical issues with the data; it cannot determine whether the data was input at the wrong location. Testing, as described above, can help root out such errors, but things can still be missed. Ultimately, if a database is being used, its data is being measured against some real-world situation on a regular basis. For example, a maintenance technician who accurately tracks down the location of a broken fiber validates much of the data within that fiber route; otherwise, the break would not have been found. We address such training needs at our training website to make it easy for all users to learn how to make the data work for them.
It's easy to underestimate the effort required to properly maintain data. But with awareness of that effort, careful planning, and consistent execution, prompt and accurate database maintenance can be achieved. In fact, it can be wildly successful. Use the suggestions above to get started. If you are having problems with your database, contact us.
Our passion is fiber optic network documentation. We’ve migrated and maintained millions of miles of fiber optic strands across hundreds of databases for organizations around the world. We know what it takes to make and keep your fiber optic network database healthy and vibrant, not only now, but for years to come.