I've been having a few Twitter conversations about Internet startups, IT development, and the 'Boys Club' they always appear to be. I've been working at an Internet company: more than a startup, less than a powerhouse. What was once called TeamworkPM.net is now Teamwork.com, following the recent purchase of that very domain name.
Several women, one in Australia and another in California, have commented, particularly on the photos we had up for the 're-branding' event.
However, this is not the whole story. We have tried to hire people who are not white males. We held 'Open Houses' and posted notices all over Cork (our home base); of the ten attendees (a very disappointing turnout) there were no people of color, and only one woman. No one was really qualified, as even the universities aren't teaching what students need to get into the computing industry. We weren't even hiring just for 'programmers', yet none of the attendees had bothered to find out what we did on the Internet.
We have had interns in (one male, one female), and while they were useful, they were more interested in finishing school than in working for us. Last year Teamwork.com set out to hire ten people, which would have more than doubled the staff, and we managed only four, not all developers, from Lithuania, the Netherlands, Australia and Bulgaria. All male, all white. And it's not our choice; it's all that we are presented with who are educated and/or interested in working with us.
In Catholic Ireland, the nuns still teach that math and science (and technology) are for the 'boys' and tell the girls to choose something else. Only since initiatives like CoderDojo have girls been learning coding and web development, and even there, girls make up only a minority.
Until women are educated and qualified to develop code or work on the web, the all-white, all-male club will remain more the norm in our industry than it is in the general population. And you can't enforce a gender balance on startups and lean companies; it would crush them. Only education will change this imbalance. So please stop complaining about us, and others, and start sending your daughters, wives and girlfriends to school.
So if you are qualified and motivated, send a CV or drop an email to Teamwork.com.
In this age of 'Big Data', the masters are the ones who hide in plain sight. If you generate billions of media bits that must be parsed by the powers that be, the devil really is in the details.
During a stint in a fraud unit I learned that the trick wasn't to read all the monitored data, but to build a pattern of 'normal' for everyone monitored. If the pattern changed, then something had changed, and an investigator was assigned.
Another article I keep remembering is an interview with a 'Ninja Assassin', who was quoted as saying:
"I never sleep in the same bed two nights in a row" (he had five bedrooms), and he never ate the same thing for breakfast (though he always ate what he wanted).
Hence the pattern he generated would always be random, and any consistently repeated event would stand out as abnormal: a 'red flag' that something wasn't right.
So to hide, and maintain privacy, you would either have to overwhelm the bit-watchers, or develop such a random lifestyle as to make normal pattern-matching methods useless.
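The baseline trick from the fraud unit is easy to sketch in Python. The event names and the drift measure here are my own invention, purely to illustrate the technique:

```python
from collections import Counter

def build_baseline(history):
    """Model 'normal' as the relative frequency of each event in a
    subject's monitored history."""
    total = len(history)
    return {event: count / total for event, count in Counter(history).items()}

def anomaly_score(baseline, recent):
    """Total drift between a recent window of events and the baseline.

    0.0 means the pattern is unchanged; the higher the score, the more
    the pattern has shifted, and (in the fraud-unit model) the sooner
    an investigator gets assigned."""
    total = len(recent)
    freq = Counter(recent)
    events = set(baseline) | set(freq)
    return sum(abs(baseline.get(e, 0.0) - freq.get(e, 0) / total)
               for e in events)

# A subject repeating their usual pattern scores ~0; a sudden burst of a
# brand-new activity maxes out the score and gets flagged.
baseline = build_baseline(["email"] * 8 + ["login"] * 2)
quiet_week = anomaly_score(baseline, ["email"] * 8 + ["login"] * 2)
odd_week = anomaly_score(baseline, ["wire_transfer"] * 10)
```

Note that this is exactly the model the 'Ninja Assassin' above defeats: if the history itself is deliberately random, every window drifts by roughly the same amount, and no single change stands out.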
Anyone want to be a Ninja Assassin?
I was reminded this Christmas holiday season that computers do not 'know' any human language, only binary, and that it takes humans to provide the translation from the machine to something human-readable. And while most computer programming languages are 'English-like', they need not be in the English language. That is just what happened first, and it could be changed to another language at any time.
This came to me in an inspired way, listening to carols in which non-native speakers were singing in Latin, and other non-English speakers were singing in English, or German, or French. You can sing in a language without knowing how to speak it.
I suspect the same thing happens when non-English speakers program computers in 'English-like' programming languages: either they layer another translation over the programming, or, as in singing, which uses a part of the brain different from the part that provides language skills, another part of the brain is used to converse with computers. Which rather makes the point that people who program think with altered brains.
Note to self: the computer is not built to do anything other than execute instructions. Hardware advances over the years have only improved the CPU's ability to gather instructions; it does not make decisions about what to execute, or in what order to execute it. That organization comes from the basic boot loader, in combination with the operating system loaded.
There are no elements of artificial intelligence built into the hardware; it has no ability to reprogram itself or to change its wiring. External forces must be applied to force change, either by altering the microcode in the core of the CPU (should that be possible) or by executing programs within the confines of the operating system: instructions provided by the boot loader, or by loaded and executing programs. It is those processes that constitute what a computer does with what it 'sees'.
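That fetch-decode-execute cycle can be sketched in a few lines of Python. The opcodes below are invented for illustration; the point is the shape of the loop: the machine blindly follows the instruction stream, and even a jump is just another instruction that rewrites the program counter.

```python
def run(memory):
    """A toy CPU: fetch the next instruction, decode it, execute it, repeat.

    Nothing here 'decides' anything; control flow exists only because an
    instruction explicitly rewrites the program counter (pc)."""
    pc = 0    # program counter: address of the next instruction to fetch
    acc = 0   # a single accumulator register
    while True:
        op, arg = memory[pc]   # fetch
        pc += 1
        if op == "LOAD":       # decode and execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "JMP":      # a 'branch' merely resets the program counter
            pc = arg
        elif op == "HALT":
            return acc

# 2 + 3, computed by nothing more than obedient instruction-following.
program = [("LOAD", 2), ("ADD", 3), ("HALT", None)]
```

Everything above this loop, in a real machine, is the boot loader and operating system deciding what to place in `memory` next.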
Any hope of producing the next generation of computing must therefore be a revolution in how the CPU is instructed to perform its instructions, what is done with the output, and what associated hardware is connected to the system to perform the 'tasks' assigned by that process. The argument over whether Windows, Linux/Unix, or any other operating system is better than another does create opportunities and restrictions unique to any new programming or computing paradigm.
Anything like artificial intelligence will have to be preceded by a new suite of hardware, with a new way of 'booting' the system and/or an entirely new operating system tailored to artificial-intelligence operations. Current hardware/software standardization is the primary blockage to any future advance in computing.
UPDATE #1: IBM Creates Custom-Made Brain-Like Chip
Well, I've worked myself out of another job, mostly, by assisting my current company into the Amazon cloud. They were already operating their system from a hosting environment, so they were mostly in the 'Cloud' anyway. And as you might guess, I dislike the whole 'Cloud' hype, as it's mostly a marketing term. So what I convinced them to do was improve their scalability by moving the servers into the equivalent systems in the Amazon 'hosting' environment.
As part of the exercise, the database moved from a MySQL database on a Windows server to Amazon RDS, and the webserver/application servers (Windows) moved to EC2 instances, with additional storage in the Amazon S3 facility.
The process was, as usual, a learning experience, and Amazon still has issues with the interfaces to their corner of the Cloud. But it all works: I managed to defragment the database, and to apply enough additional indexing and SQL revisions that it now runs so smoothly they don't need me anymore. Hence working myself out of a job. Amazon should hire me to sell their services.
I have been doing research into the nature of computers, and I've been participating in the phenomenon known as CoderDojo. As part of my research I've been relearning Assembly language on several different architectures, and I've been experimenting with such things as the ELF Membership Card, which I soldered myself and which is currently running in front of me, along with an Arduino Uno. Both represent small microprocessors, very like the ones I personally started out on.
My first computer was an Apple II+ with a MOS Technology 6502 processor. In any case, this act of relearning what a computer really is has made me aware of the lack of any real educational 'tools' like the ones I had. The sensation that is the Raspberry Pi is fast becoming the CPU-du-jour of developers, and as such may develop into a great educational tool. But, and there is always a but, it doesn't stand on its own.
The group who developed the Pi have themselves noted that it is a developmental prototype, and that it needs to be distilled into a real educational product. It first needs a keyboard, a mouse, a display, an SD memory card and a power supply even to turn it on. To make it networkable, it also needs a wired Ethernet connection. It needs software preloaded onto the SD card to boot properly. These are geek requirements: anyone who can make this work already has the working knowledge and equipment, call it infrastructure, to make this work. What is missing is a standalone environment that is self-contained and independent of both other systems and any foreknowledge of computing.
My Apple II came with a keyboard, memory, and a built-in BASIC programming language; it displayed its output on a common television and recorded and loaded programs from a simple cassette player. All of these were basic, everyday items in my household, and it plugged directly into mains power. It started up in Applesoft BASIC and displayed on the screen everything I typed.
The Raspberry Pi now needs this type of infrastructure. And while on this subject, and not to stir the pot, comes a language issue. The Apple I learned on came with BASIC, and in fact I still have a fondness for BASIC. The current argument in the 'programming education' discussions is that a language like BASIC teaches bad programming practice. Being old, I had to remember the motivations behind BASIC, and I was enlightened to connect them with my re-education in Assembly language. That was the first reason for BASIC! BASIC is, and was, engineered to follow, more or less, the structure of the instruction set of the CPU itself. Where language snobs see bad 'GOTO's in BASIC, I see machine-language conditional and unconditional branch instructions. Where I see a BASIC with line numbers (not all BASICs have them), I see 'linear' machine instructions.
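To make the GOTO-as-branch point concrete, here is a toy interpreter for a numbered-line BASIC (my own sketch, not any real dialect). GOTO and IF…THEN do nothing cleverer than reset a 'program counter', which is exactly what unconditional and conditional branch instructions do in machine code:

```python
def run_basic(program):
    """Interpret (line_number, statement) pairs of a toy BASIC.

    GOTO is an unconditional branch and IF...THEN a conditional branch:
    both simply move the program counter, just like the machine
    instructions they descend from."""
    lines = sorted(program)
    index = {n: i for i, (n, _) in enumerate(lines)}   # line number -> position
    pc, env, output = 0, {}, []
    while pc < len(lines):
        _, stmt = lines[pc]
        pc += 1
        verb, _, rest = stmt.partition(" ")
        if verb == "LET":                      # LET X = expression
            name, _, expr = rest.partition("=")
            env[name.strip()] = eval(expr, {}, env)
        elif verb == "PRINT":
            output.append(eval(rest, {}, env))
        elif verb == "GOTO":                   # unconditional branch
            pc = index[int(rest)]
        elif verb == "IF":                     # IF condition THEN line
            cond, _, target = rest.partition(" THEN ")
            if eval(cond, {}, env):
                pc = index[int(target)]
    return output

# A classic counted loop, built from a conditional branch instead of FOR.
counted_loop = [
    (10, "LET I = 1"),
    (20, "PRINT I"),
    (30, "LET I = I + 1"),
    (40, "IF I <= 3 THEN 20"),
]
```

Strip away the keywords and this is the fetch-decode-execute loop of any CPU, with the sorted line numbers standing in for linear memory addresses.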
One element of the Raspberry Pi that also misses the mark is the nature of 'abstraction'. While I admire the Python of the Pi and the 'C'-like language of the Arduino, what is missing is any sense of the distance between the learning programmer and the actual machine. It may even be a serious problem, as the machine begins to look like magic, something that can be made to do anything.
Programming the RCA 1802 chip in the ELF Membership Card demonstrated what the creator of the card referred to as 'bare-metal programming'. A simple program I used to test the ELF consisted of twelve 8-bit instructions; writing (essentially) the same program for the Arduino required downloading 998 8-bit instructions (not including the 512 bytes of the boot loader). To be sure, there were probably a lot of libraries included in that download. Helpful, but masking the actual operations of the CPU from any real educational product. Just like the Arduino, the Raspberry Pi masks the CPU and the associated hardware behind a boot loader (BIOS), followed by a full, though stripped-to-minimum, Linux kernel, then a GUI in the form of the LXDE X-Windows desktop, followed by the Python language. That's a lot of abstraction!
All of this may be irrelevant in the long term; the easy 'wins' on top of the abstraction may themselves stimulate learners to explore the 'bare metal' hardware of the Raspberry Pi. Still, I believe we are missing an opportunity to produce the next generation of computer wizards. I also believe that someone needs to integrate the Raspberry Pi into an OLPC-type device.
Having just escaped/exited a brief encounter with a company using some of Apache's web projects, I keep being struck by the feeling that I've seen these issues before. Over a fairly long run in the IT industry, I have the feeling that Apache and its contributors have been busy re-inventing the wheel. Apache Hadoop provides a distributed file system designed for large data sets; Apache Solr is a full-text search server and indexer, with Apache Lucene supplying the search libraries; coordinated by Apache ZooKeeper, it all begins to sound like a description of your average relational database management system (RDBMS).
All of these elements being created by the Apache Foundation were, at some time in the past, solved by most of the (big) relational database vendors. All the bugs and missteps were made before in those earlier developments, which only reminds me of the old saw:
"Those who do not learn from the past are destined to relive it (i.e. repeat the same mistakes)."
It is with sadness that I turned off the last Sybase instance we had running. Our last ASE server quietly shut down on an Amazon EC2 server on Tuesday the 20th of December, never to boot again.
In all truth, both Sybase instances were developer installs operating as production systems. Our two instances, each restricted to a 25-user limit, were barely able to run the system. But Sybase licensing was too archaic and inflexible for us to continue with it as a small business, and so economics forced us to convert to MySQL.
If it hadn't been for the previous management who, in some delusion of saving money, refused to pay the data-center bill, forcing us to move the Sybase instances out into the Amazon cloud (EC2) in the first place, we would probably have been on MySQL sooner, as that was the plan.
But the sadness remains. Sybase as a technology proved again that it would run, and run reliably, on just about any hardware, even virtual hardware that did not meet the specified, certified requirements of operation. The same cannot be said for the Amazon RDS version of MySQL, which crashed spontaneously, without warning, while applying an index to our live production database. This after weeks of testing and trial runs of the system on it. The only defense is that the RDS instance rebooted and was available, without data loss, in less time than a switchover would have taken on the Sybase HA setup from which this production system was developed.
So we are up on MySQL, and I am now exclusively a MySQL DBA, after spending the last 25 years as a Sybase DBA and evangelist. The decision now is whether to remain so, or to find another place of employment where Sybase remains; those are becoming more and more rare. Maybe I should take up MongoDB to stay at the cutting edge.
Over the past week I found myself in the following situation: during a migration from a Sybase production server to a MySQL-based version, I was required to 'expedite' a Sybase ASE 15 installation into an Amazon EC2 instance. The Cloud!
The company has been seeking less expensive IT infrastructure over the past few years, moving from Sun Enterprise servers with ASE clustering to commodity Intel-based Red Hat Sybase servers with poor man's replication. The final goal became a decision to convert from the expensive Sybase ASE (read: inflexible licensing) to MySQL, and generally into Amazon RDS (the cloud).
The move of a Sybase ASE into the cloud was the result of management's urgent desire to terminate a data-center contract early. The shrinking timeline for converting the Sybase schema to MySQL could not be guaranteed, so a Plan B had to be created. Hence the cloud-based Sybase edition of a production server.
To my surprise, it works! After a bit of twisting, the Red Hat ASE developer installation came off more or less like any other Sybase install. There are irregularities compared with a normal Linux install, but it is functional. Being a bit of a spindle jockey, I was (happily) surprised at the overall performance of the storage systems of the EC2 instance. And the production server is now operating in that instance (having previously moved the app and web servers into EC2).
The point of this post is this: while working the issue, I did considerable Googling for anyone using Sybase ASE in the cloud, and found nothing, or nearly nothing. What I did find was a press release from Sybase corporate, dated 2009, announcing that they were now in the Amazon Cloud, and not a peep since. Nothing: no product, no advertising, no options. What a missed opportunity; it's now easy to see why Sybase has been losing so much market share to a 'free' RDBMS like MySQL.