We have been working on a website upgrade evaluation for a few months. Recently the big cheese and another, slightly smaller, big cheese showed up in my office to discuss a couple of things.
Listen – listen to what is being asked for. Talk about it, with them, with yourself, and with your colleagues. You cannot over-talk some of this stuff – the talking should lead to enlightenment or understanding.
Rather than push some opinion about what could be done, listen, listen critically, understand what they are trying to express.
Specifically – I heard these things:
Our list-server – push email solution is not the best for sharing information. Rather, we would like a space on the new website where we could put things, like files and discussions. Instead of a discussion on the email list serve – with file attachments – we should have a space on the website where the people we want (the people on the current list serve) could come in and engage with the content there. We would need a forum-type activity that would promote discussion among the group. The space would be password protected, in addition to hiding behind an authentication requirement. When content in the space is updated, we would want to push out an email to them, like the list-serve – inviting them into the space for discussion or whatever. The concept of a push notification is still important, but only as a means to bring them into the website space. Even include a link in the email to the space.
How, then, can we add a discussion-type activity to a space where we can also upload files, video links, etc.?
There is currently a shared space on the existing site – hiding behind a password – but there is no forum or other medium for discussion.
I also heard one of the chiefs say, “we want a consistent use of software – or a familiarity.” They would like a common piece of software to use to improve collaboration. Some of our other customers are already using software A, so software A looks better in our comparison with software B. Software B is good, strong, but unfamiliar. That is software B’s disadvantage: it is unfamiliar. When the big cheese says “some of my districts already use it,” that is a big fat influence on the decision to use software A or B.
Before concluding, let me try to brain dump what I heard from both big cheeses.
a shared space for collaboration
the list server is not efficient
Is there additional cost if we need to create accounts in the software to allow authentication into the shared space? No was the answer – we have unlimited account creation options.
We want to use a common piece of software.
Moodle is an LMS; Blackboard SchoolWires and Finalsite would be CMSs (learning -vs- content management systems). Moodle could be the best tool for creating a shared space that includes a discussion forum, files, videos, etc.
An email group would still be used to notify people when something has changed in the group – a push to a group called, say, big_cheeses.
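The push idea above can be sketched simply: build an email to the group that only points people back into the shared space. This is a minimal sketch with hypothetical names (the big_cheeses address, the sender, and the space URL are placeholders, not our real ones); actually sending it over SMTP is left out.

```python
from email.message import EmailMessage

def build_space_notification(space_url, group_address):
    """Build the push email inviting the group into the shared space.
    Addresses and URL are hypothetical placeholders."""
    msg = EmailMessage()
    msg["To"] = group_address
    msg["From"] = "webmaster@example.org"  # hypothetical sender
    msg["Subject"] = "New content in the shared space"
    msg.set_content(
        "New files or discussion posts were added to the shared space.\n"
        f"Join the conversation here: {space_url}\n"
    )
    return msg

# The email carries no attachments – it only links to the space,
# where the files and the discussion actually live.
notice = build_space_notification(
    "https://example.org/shared-space",  # hypothetical space URL
    "big_cheeses@example.org",
)
print(notice["Subject"])
```

The design point is that the content lives on the site, not in the message; the email is just the doorbell.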
We are a school. Actually, a BOCES – which supports schools. One of the many things we do is provide insights into software: consider solutions, contrast benefits, tradeoffs, etc.
A notification system is a big deal to us at BOCES. We have lots of communication that we have to do. Our website software has a native notification module as part of its core. But it does not do the things that are requested of it. This is not an uncommon scenario: software doing some of what you would like, but not everything, or at least not as well as you would like. Maybe the issue is with how it communicates. Want a message across the front of your website? Want a push notification to your phone? Want a notification sent to the local media outlets? How about a Facebook post or a tweet? How about to an email list? How about someone calling your phone?
Part of the recent conversation about our website upgrade was around this. Does SchoolWires by Blackboard have a notification module? Does it do this or that? How about FinalSite? How does their software solution handle notifications? What if you want to notify parents or the community? Or even the students that their lunch money account is low? Lots of needs, lots of scenarios.
Some of our districts use SchoolMessenger. This is from their website:
“When it comes to school notification systems, reliability, ease of use, and security are the most important considerations. That’s why leading districts from coast to coast rely on the SchoolMessenger Communicate service for notification. Discover what sets our system apart today.”
I think the website solution we are upgrading to has a pretty good notification feature. It does not come with the core application, but rather is available as an additional module. This makes sense too; as software gets bigger, trying to do more things, it starts to get a little bloated in its space and memory requirements. Not everything should be included in the core software, so things should be available as plugins or additions.
We shall see. SchoolMessenger, Blackboard notifications, or some other 3rd-party vendor will offer their respective solution.
Deal with the problem or react to the symptoms. Logic dictates correcting the problem, not the symptoms. All other factors equal, correct the problem. Except…
When dealing with an old legacy Access DB, with lots of VBA and old code, that code may have been correct years ago, but the process changes. When the process changes and the code does not, issues start to seep into the system.
This is the process we follow for most issues that arise.
- clarify the problem
- investigate the code
- test a correction
- verify with my customer – she would test and verify
In this particular case, I can step back, allow her to use the system, create her bookings, and then run an update query on the data. Is this lazy? Sort of. Is this smart? Sort of.
I do not really want to make big changes to this part of the application. This is the part that writes data to a table using VBA. I have not ever changed this particular code in the app. The code works, sort of. The problem could be the process my customer is using. She may be doing something out of sequence. Maybe she is supposed to do something in a different order? I don’t really want to get into this. It is not hard for me to react and update the order data using a query.
Rather than changing the system, change the data resulting from the system. This is faulty, but does make some sense.
When my customer says:
“Hope all is well!! I’m having trouble with the Estec code. When I go to the main menu to kits, I’ve increased the costs for next year. When I “transfer” the costs to next year, it doesn’t seem to do it.”
Open the DB.
Open the kits table – so you can view the cost field and kit id (hide other columns).
Open the booking table – so you can see kit id and booking cost (hide other columns).
Create a simple update statement using the cost field from the kits table to update the booking records in the booking table:
set booking_cost = (xxx)
where kitid = xx and yearid = xx
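The manual update can be sketched end-to-end. This is a minimal sqlite3 stand-in for the Access tables; the table and column names (kits, bookings, kitid, yearid) and the sample costs are illustrative, not the real schema.

```python
import sqlite3

# Minimal stand-in for the Access database; names and values
# are hypothetical, mirroring the tables described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE kits (kitid INTEGER PRIMARY KEY, cost REAL);
    CREATE TABLE bookings (bookingid INTEGER PRIMARY KEY,
                           kitid INTEGER, yearid INTEGER,
                           booking_cost REAL);
    INSERT INTO kits VALUES (1, 25.00), (2, 40.00);
    -- Bookings for next year still carry the old costs,
    -- because the "transfer" step did not update them.
    INSERT INTO bookings VALUES (10, 1, 2017, 20.00), (11, 2, 2017, 35.00);
""")

# Copy each kit's current cost onto its bookings for the target year –
# the same correction the manual update query performs.
conn.execute("""
    UPDATE bookings
    SET booking_cost = (SELECT cost FROM kits
                        WHERE kits.kitid = bookings.kitid)
    WHERE yearid = ?
""", (2017,))
conn.commit()

costs = [row[0] for row in
         conn.execute("SELECT booking_cost FROM bookings ORDER BY bookingid")]
print(costs)  # → [25.0, 40.0]
```

Reacting to the symptom with one query, rather than rewriting the VBA, is exactly the tradeoff discussed above.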
Maybe I should look at the code executed on the button click…
This is the official propaganda statement😉.
“Rapid Responder is an All-Hazards Emergency Preparedness and Crisis Management System that securely stores and shares emergency operating procedures and response plans used by facility staff and first responders to effectively prepare for, respond to and recover from any emergency.”
An emergency response plan. All the important information you may need if your location has a security incident. Like a shooter or a hostage taker. Where police and other emergency responders would go when needing information about your school, site, buildings, etc.
Rapid Responder is a private company capitalizing on the state of fear that we now live in. RR is a service offered to organizations that would be soft targets for terrorists.
RR is a service that collects and stores information with the goal of organizing it to assist law enforcement and emergency response personnel.
RR brings organization and predictability to an otherwise very stressful situation.
Our administration likes it. RR makes them feel prepared. Prepared is a good feeling. If something terrible did happen, at least we have all the important information in one place, secure behind passwords that the bad guys should not be able to penetrate.
RR works with people in our org, collecting data about things like
- building locations
- building diagrams/schematics
- student information, # of students
- access points into and out of buildings
- rendezvous points
- surrounding areas – woods, housing, water etc.
- roads into and out of area
- safety equipment on site
- egress locations – exits
- utility locations, power, water, electrical
- network infrastructure, phones
- staging locations, for community members
- emergency response team members
I attended one of the initial meetings with RR and the admin. at one of our tech centers. We spent about an hour discussing the building, access points, and other information about the location, which was then added to our site. Later, admins with permissions added more detailed diagrams and schematics, further enhancing the info on the site.
The information is not publicly available – of course not. You don’t want the bad guys accessing this information. I have credentials for our site as if I were an emergency responder; this is what police, ERs, and such would use to look at the info.
I also have administrative credentials to the site – so I could update the information on the site. Upload new pics, schematics, change existing info. etc.
The URL to our site – https://prod.rapidresponder.com/Login.aspx
That is probably the URL to lots of different sites, since I do not see anything specific in the URL.
The URL to the admin. side of the site – https://prodadmin.rapidresponder.com/Login.aspx
The only difference is the “admin” after the “prod” and before the rapidresponder.com domain.
A couple of integration concepts that failed for us…
1 – Our admin. folks wanted to place an icon on a schematic within RR and, upon clicking, automatically be shown that camera. I think RR oversold that feature. Our security camera server sits in our network, behind our firewall. Since the RR site does NOT sit behind our firewall, in our network, there was no easy, clean way to allow access to a camera from outside the network. Genetec, our camera security software, provides a web client for accessing cameras. It requires authentication for access. There are no backdoors into the system – no way to simply enter the software and view a camera without using the front door (authenticating) and then entering a room or building to view its cameras. We talked to the folks at RR about where our site could live, mainly asking them if it could live in our network, i.e., could we host the site; their response was something like “uh, no – the sites live in our secure area.”
2 – Next, then, was the request to automatically log into the security camera system from within RR. Since the emergency responders already had to authenticate, could we simply pass those credentials along when they view the security cameras? Seems like a reasonable request, EXCEPT we were not able to get that to work either. The camera software engineers from Genetec simply said “passing in login credentials on the URL and having it automatically log in was not a feature of the software.” Users have to come in via the front door and show their ID.
That was 0 for 2 on security camera integration from RR to Genetec.
Instead, we simply have links on various building schematics in the RR space that load the Genetec web client. My customer did add camera names to the schematics in the RR space and then updated those camera names in the Genetec space – but still not any real integration. This is a slice of one of the schematics that was added, complete with a camera icon that includes the name of the camera and a link to the web client to view it. The link opens the security camera software in a popup – but not logged in.
Since then, another of our sister organizations is now flirting with the idea of using RR – great. Their network guy said this when thinking about the integration between RR and their existing security cameras. They must not have used Frontrunner Systems to install their cameras on their network – it sounds more like they do not have the concept of a web client for viewing cameras.
I spoke with Don and Terri of bla blas on Friday.
Their Health Safety and Rish Management department is working with the surronding county 911 departments and several of our componets school districts.
We brain-stormed around how first reponders in the region might be able to get access to district security cameras (both IP enabled and closed circuit).
They currently have serveral district in both of our regions that have signed up for the Rapid Respnder servive. One solution that we came up with was to postion a wks.(terminalserver) on the network that 911 and first respnders could RDP into without having to use A VPN client.
We could then, with access-lists lock this workstation down to only allow the particpaiting counties to come into the box remotely. Additionally we would use access-lists to limit the wks. to only the cameras that are requested by the district. In many cases the security camera network is a sepearte vlan which should make this fairly simple.
This seems a better solution than trying to manage multiple VPN accounts that can be shared beyond the original intended user. The wks.(terminal server) would be the check point for login credentials much like we have for our tech staff. But it would have limited abiltiy
to see the entire network.
Actullly we could have GV be respnsible for the server on their network and when the proper requests are made we change the firewall rules to allow them. Much like we do with the shared CBO now.
Any thoughts concerns etc? Is this something that we could go ahead with?
Hmmm, first of all, there are lots of typos; Dave should use spellcheck. His thinking is interesting. I do not understand a lot of it, like terminal servers and a separate VLAN, but I can deduce that they do not use Genetec, since there is no mention of a web client that is currently used to access their existing security cameras.
More to follow.
A question that many have pondered: should I put my stuff in the cloud or not? Once upon a time, there was no cloud, except for a corporate network – and that is not really the cloud, it lives in your house. I used to use 3.5-inch floppy disks to store my work. Then came the zip drives – 100 MB, wow. Shortly after that, the CD: write your stuff to a disk, it’s 600 MB, wow, wow. Then came along flash drives and bigger external hard drives – well, actually smaller, with more storage. Over a gig! Sometimes 4 GB (remember, 1 GB is 1024 MB). Then, thumb or flash drives that were over 16 GB. And now, to the cloud. It’s becoming more normal to think “save it to the cloud.” “Let the cloud handle that.” “Trust in the cloud.” Think Google Drive or Microsoft bla bla. These are services that allow you to save your stuff somewhere else – you may not even care where. Just save it somewhere, and remember where, and your credentials to access the “where.”
I am brain-shifting now from stuff you create to software programs that allow you to create. Where are they? On a server in your corporate network? Maybe. Or somewhere in the cloud.
What is the cloud? A bunch of servers with lots of storage capacity. The cloud is a nebulous concept, invisible really. It’s not on your flash drive or on a server in your office, down the hall, or across the road. It’s the great server in the sky. The cloud is a destination. A trustworthy place where you can put your things and not worry about them. A place to put things that matter to your job or business, where you do not have to worry about them or update them – the software or servers, anyway.
Lots of big companies – IBM, Google, Apple, Microsoft – all offer cloud services. Hosting. How about Apple and their music cloud? No need to keep all that stuff on one device; put it up there in the cloud and you can have it on all your devices. That does make some sense – though the syncing of devices to the Apple cloud is fraught with peril. I have seen it spin people around numerous times.
I like keeping things in the cloud; I always know where stuff is. For anything to be available at any time from anywhere, it must be accessible. How could something be available from anywhere at any time? First, it must exist and be saved to a cloud server. Then, to access it, there must be a device and an internet connection.
If there is no internet connection, or the internet has a bad day, or the software that is allowing you to access your part of the cloud has a bad day… then you do not get your stuff. If you had your stuff on a flash drive or saved to the local hard drive of a computer or device, then yes, you could still access your stuff.
Companies, when selling cloud services, brag about uptime, or access time, or lack of downtime. This is what the 99.9% uptime propaganda is talking about.
I’m going to bring it in a little. I support software systems where I am dealing with these types of cloud-or-no-cloud decisions. I just recently enabled Google Drive on one of the Moodle instances I support. This is an example of allowing one system to access another – GD is the cloud. More people in education are relying on cloud storage. When it comes to word processing, they make use of GD. If a person has a Google email and makes use of the GD service, then they are storing things to the cloud – the Google cloud, one of them anyway.
I have another Moodle system I support where we offer credit courses to New York State students. We purchase course material from about 5 different vendors. Some of the vendors load the course content to our Moodle server, while others load it to the cloud and reference it from our Moodle. Both solutions have advantages, similar to cloud -vs- local advantages. When course content is loaded locally, we have complete access to it and can change it more easily. When content is referenced from the cloud, I don’t have to worry about if I have the latest content. Sometimes, when content is loaded locally, it gets outdated when updates are made to a master and not pushed down to our local server.
I received an email from one of the course vendors this a.m. It says “updates were made to many courses and may need to be updated in client master shells.” Translated, this means: they push course content to the local server and may need to update courses previously deployed. This has happened a few times over the years, where updated content did not make its way to a deployed course in our Moodle. If the course content were consumed from the cloud and referenced from a local shell, the content would have been automatically updated.
It’s all a little complex; there is no silver bullet or best option that covers all situations. Rather, there are options. The better you understand the options, the better service you can provide your customers.
The cloud is the future. Cloud services are the future. Probably dominated by a few giant cloud providers.
I support a few systems that implement similar granular permissions logic. When content managers want to use the system, they must have permissions to do so.
Yesterday, a “content manager” of our security cameras software system called me and said
“James, I am trying to export a piece of video from one of the cameras, but cannot. “
There was an incident on our campus; someone wanted to export a 90-second clip from the previous week. Good. That is why we keep 30 days of video from most of the cameras in our system. I was able to access the camera, find the video, and export it for him. Great.
But remember, a good rule is to empower your users – give them the ability to do what they need in the system. They should not have to rely on you. Let them do their jobs. Great. Except, here comes the conundrum. With systems and permissions, the more liberal you are with them, the more holes you open in your system. In theory, it’s much better to have fewer accounts with lots of permissions. It’s like opening a hole in your firewall. You prefer not to do it. The default position is no. Permissions are granted on a need basis. When we do grant permission, it is only to the area of need, to the chagrin of simple and easy.
That is the balance you try to strike:
1 – empower your users; give them permission to do their job – you do not do it for them – you are not a bottleneck
2 – a granular, minimalist approach. No super admin for you.
I solved this issue by adding a very granular permission to a group where the user was a member, and then asked him to log in and try to export the video piece again. This solution was a little bit of 1 and a little bit of 2; perhaps that is the blend to strive for.
Steps in the Genetec Config tool to grant permission to export video:
- log in to the desktop Config client
- find and select the user group where the accounts live
- drill into Privileges | Actions | Cameras | View live video | Export video
- select allow
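The group-based grant can be sketched in general terms. Nothing here is the Genetec API – the class names, privilege strings, and checking logic are purely illustrative of the granular, group-level model described above.

```python
# Illustrative permission model: users inherit privileges from groups,
# and grants are made at the narrowest level that covers the need.
class Group:
    def __init__(self, name):
        self.name = name
        self.privileges = set()

    def grant(self, privilege):
        self.privileges.add(privilege)

class User:
    def __init__(self, name, groups):
        self.name = name
        self.groups = groups

    def can(self, privilege):
        # A user may act if any of their groups carries the privilege.
        return any(privilege in g.privileges for g in self.groups)

# Hypothetical names mirroring the steps above.
operators = Group("camera-operators")
operators.grant("cameras.view_live_video")

user = User("content-manager", [operators])
print(user.can("cameras.export_video"))  # False – the export that failed

# The fix: grant only the narrow export privilege to the group,
# rather than handing anyone super-admin rights.
operators.grant("cameras.export_video")
print(user.can("cameras.export_video"))  # True
```

Granting to the group rather than the individual user is the "little bit of 1 and 2" blend: the user is empowered, but only within the area of need.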
I’m sorry, a little mundane. Today I was asked to update some contact information on a secure part of our website. These are the steps, for now.
- convert the provided Word document to PDF; remove spaces from the file name
- log in to wflboces – open the District Wide Safety plan, then the sub-page for the emergency contact info – this is a password-protected page
- in edit mode, click on the files link, then update the appropriate file by uploading the new one. This will change the reference at the bottom of the page – where the link is placed
- manually update the link at the top of the page to reflect the file name update, since the bottom is not visible enough.
It took me a little while to work through this process. It is all changing soon, as we migrate to a new web platform.