Category Archives: Science and Technology

Monsanto and the Farmers’ Suicide

Over at Discover magazine, Keith Kloor has a post about the Bt cotton suicide narrative in India (http://blogs.discovermagazine.com/collideascape/2014/01/07/selling-suicide-seeds-narrative/). The post announces a much longer article on the subject (http://blogs.discovermagazine.com/collideascape/files/2014/01/GMOsuicidemyth.pdf).

I am a big proponent of genetic engineering and believe that genetically modified crops and livestock will go a long way toward reducing our environmental impact and producing enough for a growing population. However, I am also a strong critic of the monopolistic practices of the GM companies (Monsanto et al.).

I acknowledge that these companies spend a lot of money to develop their GM products. They guard their trade secrets and place very severe restrictions on replanting, or on any other attempt by the farmer herself to propagate the modified traits.

Vandana Shiva is a person I admire for her zealous advocacy on issues of women and the poor. However, I have grave differences with her ideology, methodology, and tactics. The article charges Ms. Shiva with manufacturing a crisis story, or at least with appropriating a real tragedy for her fight against globalization and genetically modified crops. At the same time, I am equally or more alarmed by the fast pace of economic “liberalization” in India, which vastly increases the power and resources of the rich and of business while systematically eroding social safety nets. Giving a huge multinational corporation any level of power over India’s food supply is quite a scary thought. However, as I said in the beginning, I believe GM as a technology has a lot to offer a country like India.

While Ms. Shiva is a particularly vociferous and at times over-the-top activist, one thing the article does not say much about is the much higher cost of farming with modern technologies, including GM (GM seeds typically cost much more and come with severe restrictions on re-seeding). The tightening of credit by the ongoing liberalization of banking (which, by the way, is part of the same globalization Ms. Shiva and many others oppose), the weakening of the social safety net, and the perennial Indian problems of slow infrastructure growth and uneven investment all weigh on the poor in India, and most farmers are poor.

Again, while I have no undue worries about the technology of genetically engineered agricultural products, the Monsanto way (or that of any other large GM seed company) might not be the best way to provide agricultural stability in developing countries. My primary concerns are the cost of seeds and the strictly commercial nature of their availability. In the case of a crop failure, for example, a farmer could once acquire locally grown seed for very little money. But when most farms are already growing GM crops (Bt cotton, for example), the non-GM farms will fare extremely poorly. It is not a far-fetched conclusion that widespread adoption of GM food crops owned by large multinationals like Monsanto could significantly affect the country’s food security.

The problem in India is not the technology of GM, but the way the technology is being adopted. If, for example, Bt cotton were an open-source technology, it could have transformed Indian villages in the cotton belt.

Things are not always black and white.


Pixie Flux theory of Quantum Consciousness

In a video by Sixty Symbols (see below), Prof. Moriarty, while giving a dressing-down to Dr. Lanza and his theory of quantum woo, mentions that one could postulate pixies popping in and out of existence to create spooky effects. I think this is a serious proposition. One could, without violating any of the laws we currently hold, hypothesize that this is what actually happens.

Prof Moriarty on Quantum Woo

This is how it works. Since anything can happen without violating any physical laws within the Planck time (like the creation and annihilation of particle-antiparticle pairs), one could hypothesize that there are magical Pixies that pop into and poof out of existence in under 10⁻⁴³ seconds. These Pixies are the ones that maintain reality and cause all kinds of quantum spookiness. The Pixies are also the carriers of consciousness. Well, actually, consciousness is produced by the fluttering of the Pixie wings.
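For the record, the Planck time the Pixies have to beat is the standard combination of fundamental constants (a reminder, not a new claim):

$$ t_P = \sqrt{\frac{\hbar G}{c^5}} \approx 5.39 \times 10^{-44}\ \mathrm{s} $$

so the 10⁻⁴³-second window above gives the Pixies just a couple of Planck times to do all their work.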

This hypothesis is named the Pixie Flux Hypothesis© of Quantum Consciousness. I am claiming the copyright for this hypothesis. Unlike actual theories of physics, for which I could write a paper and get it peer reviewed and published, this one can only be ascertained by the brute force of an enlightened mind (mine).

Pixie Flux Hypothesis by Salim Nair is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://bamboodreams.wordpress.com/.


Scienciness

I was watching this video of Richard Dawkins debating Deepak Chopra. It is an interesting watch.

While watching it and responding to the comments, I was trying to find the right word for what Deepak Chopra does with science; something that expresses the manipulation and dismemberment of scientific ideas that Chopra engages in. Then I remembered. Truthiness! But that is about truth. A good version of it for this case would be Scienciness. It appears the word has been used in a similar fashion by a few before me, here and here for example. That is perfect. So, now I have a good word for what Chopra does.

He is indulging in Scienciness!!!


The Arsenic DNA Bacteria that might not be!

So, early last week, NASA came out with an announcement saying that it would hold a press conference on the 2nd of December to announce a major development with implications for astrobiology.

A day after NASA’s announcement, Gizmodo published a highly speculative article predicting that NASA was going to announce that it had found life on one of Saturn’s moons. Then the floodgates opened: a stream of articles, some with completely bizarre and unfounded speculations and rumors about the discovery.

On December 2 at 2 PM, NASA announced the finding (at the same time, the paper, titled “A Bacterium That Can Grow by Using Arsenic Instead of Phosphorus,” was made available in Science). The authors claimed in the paper and asserted in the press conference that they had shown that these bacteria can not only tolerate arsenic but incorporate it into their DNA structure in place of phosphorus.

This was followed by another flood of reports and blog posts. There was great excitement about the news, especially because, if correct, it would mean that life can survive a much wider range of conditions than we thought. It would also suggest that the main ingredients of life as we know it are not that fixed after all.

The first thing to come out of it was that the finding was certainly overhyped by NASA, which clearly hinted it was something more than it is. Yes, finding a life form that can substitute arsenate for phosphate is incredible. But we know life on Earth is very resilient and innovative. It certainly does not mean that we are closer to finding exobiological entities.

Along with this came a series of posts by scientists questioning the validity of the methods used by the NASA scientists (Felisa Wolfe-Simon et al.). Among the many I read on this subject, two stand out for their clarity of presentation and scientific rigor (no, that is not a claim about their correctness).

The article by Alex Bradley looks at the problem from a chemist’s perspective and raises a few very serious process and interpretation issues. He correctly points out due diligence that the NASA scientists did not perform.

An even more detailed criticism by Rosie Redfield can be read here. She goes deeper into the methodology and points out many potential pitfalls in it.

All this reminded me of the famous Carl Sagan adage “Extraordinary claims require extraordinary evidence”.

Another point this brings to light is the methodology of science, and the immediacy and openness with which it now plays out. Answering requests for comment, one of the authors of the article said that the discussion should happen in peer-reviewed journals, not in the blogosphere. That is a bit odd, though, considering that they themselves made an effort to make this very public.

Irrespective of how this particular finding turns out, science will go on. As I have noted earlier, the usually hidden dynamics of the process of scientific enquiry are now spilling out for everyone to see. It is beautiful and exciting. Now, if only the media would learn to stop seeing everything in black and white.


Being Agile… Part II: Never stop changing

The point at which I stop reading an article about agile development is when it starts quoting the Agile Manifesto. No, I do not have any qualms with the manifesto; I think it is an excellent minimalist document. However, when people start to preach about it, I tune out.

The same goes for when someone brings up a specific set of practices, usually with a cute name, as The Process. If my last seven years have taught me anything, it is that the only process that stays is changing processes.

This was very true in the beginning. We went back and forth, and back again, to Scrum as the organizing process, but played with its format and deliverables for quite a long time. We tried weekly iterations as well as iterations longer than a month. We filled our walls with multi-colored Post-it notes, built weird-looking shared Excel sheets, and used formal project management software with custom extensions to track the sprint.

We had been using unit testing to some extent even before our full plunge into the agile pond, but the development and testing phases were mostly separate. After introducing Scrum and co-located teams, we were not sure how to interact with each other for a while. The lingering mistrust between the programmers and testers was quite palpable.

Most of our automated acceptance tests were UI tests. While these are in some sense the ultimate integration tests, precisely because of this overarching scope they require most of a feature, including a near-final UI, to be ready before they can run. In those early days of UI automation, the tests were extremely sensitive, and just adding one control to a form would break a whole bunch of tests in mysterious ways. We actually had a campaign: “does it break automation?” (These days we ask how to come up with a breaking acceptance test, or a directed failure in existing tests, in order to implement a feature or fix a bug. More about that later.)

One of those days Michael Feathers came to our office and told us to go find fracture points and start clawing from there; I am paraphrasing. We had been looking at a really huge block of rock, hoping for nice score marks: tap it with a soft mallet and out come the nicely shaped pieces. He wanted us to look for fissures and cracks instead. That is what we did. Delphi, the language of our code base, is not very refactoring-friendly. After five years of yearly releases, the original architecture was turning into a tangled web. (Hmm, so we have a huge rock with a tangled web around it. See, my metaphor is still going strong. How many of you have pulled the dry stems of climbers off rock surfaces? You have to be very careful or they will break. And they do have a tendency to go into cracks!) Though we were a bit unconvinced about the feasibility of a test-driven agile development strategy for our code base (we, of course, wanted to rebuild from scratch!), we looked at our Java brethren with green eyes. They had all the tools: full reflection, managed code… We wanted it!

Once we started to pull at these stems, things started to happen. We tried to follow TDD as much as possible, but at the unit test level. However, once these frequent changes started to spill over and break automation, things became serious. Mind you, we were also working at breakneck speed toward a new release. While some consideration was given to the additional burden of process adoption, it did not change the deliverables much. So, if the automation was not passing, we could not call a feature done. If the feature was not done, we did not get to stand in front of everyone for an ego boost during the sprint retrospective.
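To make that "breaking test first" rhythm concrete, here is a minimal sketch in JUnit 4 (Java rather than our Delphi, and the DiscountPolicy rule is purely illustrative, not something from our product):

```java
// A minimal sketch of the "directed failure first" rhythm, in JUnit 4.
// DiscountPolicy and its bulk-order rule are illustrative examples.
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class DiscountPolicyTest {

    // Step 1: write the test for the new rule; it fails (red)
    // until DiscountPolicy actually implements the rule.
    @Test
    public void bulkOrdersGetTenPercentOff() {
        DiscountPolicy policy = new DiscountPolicy();
        assertEquals(90.0, policy.priceFor(100.0, /*quantity*/ 12), 0.001);
    }
}

// Step 2: the smallest implementation that turns the test green;
// refactoring comes after, with the test as a safety net.
class DiscountPolicy {
    double priceFor(double basePrice, int quantity) {
        return quantity >= 10 ? basePrice * 0.9 : basePrice;
    }
}
```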

Our early sprint retrospectives started with a science-fair-style demo of our features in a lab. We even got to pitch our new ideas. After a few sprints, though, we decided to do the demos in a formal fashion: PowerPoint or a live demo, one at a time, within a fixed time slot. Then we decided to do the demos in the team rooms and let the stakeholders walk from room to room. Then we decided to do them as a presentation for everyone, then we went back to team rooms, then we decided to record them and post them the previous day…

They all worked.

Coming back to the automation dilemma, it was soon clear that UI-centric acceptance automation was not enough to support the new way of development. It was complex and time-consuming to write and maintain, and it took awfully long to execute, making it unviable as part of continuous integration. If we were to have confidence in what we were doing (remember, much of the code we wanted to refactor was in units we seldom touched), we needed acceptance tests that ran with every build, or at least once or twice a day. Jealousy is a good motivator. We had been drooling over FitNesse from the moment Uncle Bob showed us what it could do. There was no Delphi support in Fit at the time, so we ported Fit to Delphi Win32 and started writing some tests. This was the same time Delphi came out with a .NET version. We had to try it. We managed to compile enough of our code in .NET to cover the core and common business rules. The cross-compilation exercise also created an opportunity to redefine layer boundaries through package restructuring. So we abandoned our Win32 FitNesse plan and started using the .NET version of the code to write FitNesse tests for core functionality. This, along with the business objects framework that was introduced mostly through unit testing, finally started to carve into the block, making the cracks bigger and bigger.
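For anyone who has not seen Fit, here is a minimal sketch of a column fixture in the Java flavor of Fit (our Delphi and .NET ports followed the same shape); the fixture and its shipping rule are illustrative, not from our code base:

```java
// A minimal Fit column fixture (requires fit.jar on the classpath).
// Public fields are bound to the input columns of the wiki table;
// public methods named with a trailing "()" in the header are called
// for the expected-value columns.
import fit.ColumnFixture;

public class ShippingCostFixture extends ColumnFixture {
    // inputs, filled in from each table row
    public double weightKg;
    public boolean expedited;

    // output, compared against the "cost()" column
    public double cost() {
        double base = weightKg * 2.5;          // illustrative per-kg rate
        return expedited ? base * 2.0 : base;  // illustrative surcharge
    }
}

// The FitNesse wiki table that drives it (each row sets the inputs,
// then checks the expected value in the cost() column):
//
//   |ShippingCostFixture|
//   |weightKg|expedited|cost()|
//   |1.0     |false    |2.5   |
//   |4.0     |true     |20.0  |
```

The appeal for us was exactly this shape: business rules expressed as tables that analysts and testers can read and extend, executing against the real core code on every build.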

We had very supportive upper management during this transition. But as the release progressed, each sprint they would find that some of the things that were supposed to be done were not done. This naturally raised the question of the accuracy of our estimates. Even though we were quite aware of the arbitrariness of the numbers we put in the remaining-work column, it had never really sunk in. This gave rise to a series of estimation games and strategies. We had long planning sessions up front. Longer planning sessions at the beginning of the sprint. More accountability for estimates. Complexity, risk, and confidence factors; 0 to 1, 1 to 10, percentages… Attempts at accurate time reporting. And the all-powerful velocity. We must have multiplied and divided every measurable and quantifiable aspect of the development process by every other to come up with a velocity unit. Ideal teams, developer-days, a two-hour daily allowance for meetings, fractional contributors.
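As a taste of what those games looked like, here is a minimal sketch of one such velocity formula; every name and constant in it is illustrative, and ours changed from sprint to sprint:

```java
// A minimal sketch of one "velocity unit" formula
// (illustrative numbers; every constant here was endlessly debated).
public class SprintCapacity {

    // Capacity in ideal developer-hours for one sprint.
    static double idealHours(int developers, double fractionalContributors,
                             int sprintDays) {
        double meetingsPerDay = 2.0;                 // daily allowance for meetings
        double focusedHoursPerDay = 8.0 - meetingsPerDay;
        return (developers + fractionalContributors) * sprintDays * focusedHoursPerDay;
    }

    // Velocity: story points actually completed per ideal hour,
    // used to forecast what fits in the next sprint.
    static double velocity(double pointsCompleted, double idealHours) {
        return pointsCompleted / idealHours;
    }

    public static void main(String[] args) {
        double last = idealHours(6, 0.5, 10);  // 6 devs + a half-time contributor
        double v = velocity(78, last);         // 78 points completed last sprint
        double next = idealHours(5, 0.5, 10);  // next sprint: one dev on loan
        System.out.printf("velocity = %.3f pts/ideal-hour%n", v);
        System.out.printf("forecast = %.0f points%n", v * next);
    }
}
```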

They all worked…

Even when many of them did not bring the results we hoped for. But when they didn’t, we got a chance to learn why. Isn’t that the spirit of any scientific enquiry!


Being agile… Part I

The noise level around agile software development is deafeningly high these days. Maybe it has already peaked, which is probably a good thing.

My real encounter with agile development in a production environment happened in 2003, when the company I work for decided to adopt agile practices. We had been playing with the Fish! philosophy before that. It was quite amusing and often gave me an Office Space feel, as if someone were trying to pour happiness down my throat.

Sitting in the early presentations and crash courses on agile, I was quite skeptical about its adoptability to our code base. Our experience of the first three years of agile was presented at the 2005 Agile conference (Teaching a Goliath to Fly). Five years after that paper, we are still agile, more so than before.

So, among all the other noise, I will add mine as well.

I am planning to write a series of posts about the interesting facts, realizations, and revelations from this period, along with my reflections on the larger state of affairs.

First of all, I believe the spirit of agile development is its lack of rigidity. Unlike the earlier, well-defined software development life cycles, which specified the artifacts (sometimes down to their visual or textual format) to be used at each stage of development, agile presents a few basic principles. There are, of course, attempts by many to impose similarly over-specified artifacts on agile as well, but they are the exception, not the rule.

It all comes down to the realization that the software development process is quite messy. But any complex human endeavor is messy, especially one that involves a lot of abstraction. In many engineering disciplines, we have managed to come up with systems and practices that control this messiness to a very large extent. Still, if you have ever been associated with a construction project, you know that with all the plans, architectural drawings, RFPs, and so on, the final product turns out quite different from the original conception of time, resources, and form. But since we know the cost of making a change, we just try to live with it.

In the case of software, there is a common assumption that it can be altered quite easily; that it is soft and malleable. It is also quite abstract, even in its final form: a model of the real world, a simulation of the behavior patterns of a storefront clerk or of the physical movement of a fleet of trucks. It communicates tersely with the user, mostly in words or in highly iconized visuals. We create this model through a series of layered abstractions: real-world observations, verbal descriptions, mathematical equations, and finally the implementation tools (programming languages, testing tools, modeling tools, etc.).

Many early attempts at controlling the messiness of software development did so by controlling change. Mimicking other engineering disciplines, we tried to create detailed design artifacts, elaborate triaging procedures for change control, and sometimes downright scare tactics! Every stage of development erected huge walls of artifacts between its predecessor and successor. And as with any wall, it seeded animosity. I remember that in my previous job I met an actual tester only after several months. But even before meeting one in the flesh, I was quite happy to trash them and had developed quite a distaste for their ways. We did the same to the “architecture team,” whom I never met in my two years there.

A significant symbolic gesture we made, prompted by Ken Schwaber, was to demolish the walls of our cubes. Our cubes had five-foot-high walls, and you could only see the person sitting directly across from you. The window cubicles were a status symbol. The layout also separated the programmers from the testers and the BAs quite effectively, placing them in different areas of the floor. Losing the privacy of their cubes was a shock to a lot of people. Many complained that the new “team rooms” were too noisy; Ken complained that we were too quiet. (The team I am on now has the honor of being the noisiest!)

We were officially following Scrum and agile. But the nature of our products made the adoption quite challenging. Since we develop packaged software, there is not a lot of direct and immediate interaction with end users, and a release cycle typically spans a year or more. The adoption of new versions by customers is even slower. There were serious doubts about the predictability of an iterative process. We were changing things very often: trying new ways of planning, inventing complex sticky-note schemes, pairing and not pairing, fighting over the differences between unit testing and acceptance testing.

One thing we did not do was adhere to a single set of practices.


Fallacy of Exaggeration

In a wonderful article about the brain, Carl Zimmer has this quote:

The brain is, in the words of neuroscientist Floyd Bloom, “the most complex structure that exists in the universe.”

Now, I understand the awe we feel looking at the complexity and ingenuity of our brain, but the most complex structure in the universe?! Biological systems tend to accumulate entropy to create complex systems, but there are many other phenomena in the universe with much more entropy. Think about the nuclear fusion at the center of stars, or about supermassive black holes! Now think about the possibilities of life (as self-replicating, self-regulating systems capable of building complexity) in the rest of the universe. Some of it might have had billions of years to evolve. Maybe there are sentient networks that span whole solar systems.

How would you even conceive of a thought experiment to verify that statement?

I have to disclose that I did not read Bloom directly and do not know the larger context from which the quote is taken.

Such statements, like any other unverifiable statements, should be avoided when someone writes about science.


What is not evidence!

In most discussions I have with people of fervent religiosity, one thing that frustrates me is the lack of a basic understanding of what constitutes evidence. So, today I found this gem of a list in the comment section of a blog. It is a pretty comprehensive list of the types of arguments religious people usually use to defend their faith.

Allegations are not evidence.
Hearsay is not evidence.
Unsubstantiated claims are not evidence.
Personal revelation is not evidence.
Anecdotes are not evidence.
Rumors are not evidence.
Wild speculation is not evidence.
Wishful thinking is not evidence.
Illogical conclusions are not evidence.
Disproved statements are not evidence.
Logical fallacies are not evidence.
Poorly designed/executed experiments are not evidence.
Experiments with inconclusive results are not evidence.
Experiments that are not and cannot be duplicated by others are not evidence.
Dreams are not evidence.
Hallucinations/delusions are not evidence.
Experiments whose methodology is not open for scrutiny are not evidence.
Data that requires a certain belief is not evidence.
Information that is only knowable by a privileged few is not evidence.
Information that cannot be falsified is not evidence.
Information that cannot be verified is not evidence.
Information that is ambiguous is not evidence.

Now, if only everybody would play nice and follow these basic principles. I am not very optimistic, because if the theists, especially the fanatical ones, used any kind of logic and rationality, they would find their own arguments crumbling like dry puttu.


NASA postpones Dawn… Again!!!

NASA today decided to postpone the Dawn asteroid mission again. Read the story here.

The mission is very interesting in that it is the first time we are visiting an asteroid up close. Remember, in most space sci-fi thrillers, mining colonies in the asteroid belt are a common story line. It looks like the mining would be not for rare, exotic materials like Naquadah or something of that sort, but for water.

My interest in the mission is not in colonizing the asteroid belt, but in the use of several new technologies that NASA has kept on the back burner for so long, the ion propulsion drive first among them.

I still haven’t figured out why NASA takes at least 100 times more money, time, and other resources to do something than doing it any other way would. But hey, it is NASA.